Oct 10 05:04:58 np0005479823 kernel: Linux version 5.14.0-621.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-11), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Tue Sep 30 07:37:35 UTC 2025
Oct 10 05:04:58 np0005479823 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Oct 10 05:04:58 np0005479823 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-621.el9.x86_64 root=UUID=9839e2e1-98a2-4594-b609-79d514deb0a3 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct 10 05:04:58 np0005479823 kernel: BIOS-provided physical RAM map:
Oct 10 05:04:58 np0005479823 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Oct 10 05:04:58 np0005479823 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Oct 10 05:04:58 np0005479823 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Oct 10 05:04:58 np0005479823 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Oct 10 05:04:58 np0005479823 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Oct 10 05:04:58 np0005479823 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Oct 10 05:04:58 np0005479823 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Oct 10 05:04:58 np0005479823 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Oct 10 05:04:58 np0005479823 kernel: NX (Execute Disable) protection: active
Oct 10 05:04:58 np0005479823 kernel: APIC: Static calls initialized
Oct 10 05:04:58 np0005479823 kernel: SMBIOS 2.8 present.
Oct 10 05:04:58 np0005479823 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Oct 10 05:04:58 np0005479823 kernel: Hypervisor detected: KVM
Oct 10 05:04:58 np0005479823 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Oct 10 05:04:58 np0005479823 kernel: kvm-clock: using sched offset of 4561135004 cycles
Oct 10 05:04:58 np0005479823 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Oct 10 05:04:58 np0005479823 kernel: tsc: Detected 2800.000 MHz processor
Oct 10 05:04:58 np0005479823 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Oct 10 05:04:58 np0005479823 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Oct 10 05:04:58 np0005479823 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Oct 10 05:04:58 np0005479823 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Oct 10 05:04:58 np0005479823 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Oct 10 05:04:58 np0005479823 kernel: Using GB pages for direct mapping
Oct 10 05:04:58 np0005479823 kernel: RAMDISK: [mem 0x2d858000-0x32c23fff]
Oct 10 05:04:58 np0005479823 kernel: ACPI: Early table checksum verification disabled
Oct 10 05:04:58 np0005479823 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Oct 10 05:04:58 np0005479823 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 10 05:04:58 np0005479823 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 10 05:04:58 np0005479823 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 10 05:04:58 np0005479823 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Oct 10 05:04:58 np0005479823 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 10 05:04:58 np0005479823 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 10 05:04:58 np0005479823 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Oct 10 05:04:58 np0005479823 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Oct 10 05:04:58 np0005479823 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Oct 10 05:04:58 np0005479823 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Oct 10 05:04:58 np0005479823 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Oct 10 05:04:58 np0005479823 kernel: No NUMA configuration found
Oct 10 05:04:58 np0005479823 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Oct 10 05:04:58 np0005479823 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Oct 10 05:04:58 np0005479823 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Oct 10 05:04:58 np0005479823 kernel: Zone ranges:
Oct 10 05:04:58 np0005479823 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Oct 10 05:04:58 np0005479823 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Oct 10 05:04:58 np0005479823 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Oct 10 05:04:58 np0005479823 kernel:  Device   empty
Oct 10 05:04:58 np0005479823 kernel: Movable zone start for each node
Oct 10 05:04:58 np0005479823 kernel: Early memory node ranges
Oct 10 05:04:58 np0005479823 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Oct 10 05:04:58 np0005479823 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Oct 10 05:04:58 np0005479823 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Oct 10 05:04:58 np0005479823 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Oct 10 05:04:58 np0005479823 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Oct 10 05:04:58 np0005479823 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Oct 10 05:04:58 np0005479823 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Oct 10 05:04:58 np0005479823 kernel: ACPI: PM-Timer IO Port: 0x608
Oct 10 05:04:58 np0005479823 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Oct 10 05:04:58 np0005479823 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Oct 10 05:04:58 np0005479823 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Oct 10 05:04:58 np0005479823 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Oct 10 05:04:58 np0005479823 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Oct 10 05:04:58 np0005479823 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Oct 10 05:04:58 np0005479823 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Oct 10 05:04:58 np0005479823 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Oct 10 05:04:58 np0005479823 kernel: TSC deadline timer available
Oct 10 05:04:58 np0005479823 kernel: CPU topo: Max. logical packages:   8
Oct 10 05:04:58 np0005479823 kernel: CPU topo: Max. logical dies:       8
Oct 10 05:04:58 np0005479823 kernel: CPU topo: Max. dies per package:   1
Oct 10 05:04:58 np0005479823 kernel: CPU topo: Max. threads per core:   1
Oct 10 05:04:58 np0005479823 kernel: CPU topo: Num. cores per package:     1
Oct 10 05:04:58 np0005479823 kernel: CPU topo: Num. threads per package:   1
Oct 10 05:04:58 np0005479823 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Oct 10 05:04:58 np0005479823 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Oct 10 05:04:58 np0005479823 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Oct 10 05:04:58 np0005479823 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Oct 10 05:04:58 np0005479823 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Oct 10 05:04:58 np0005479823 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Oct 10 05:04:58 np0005479823 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Oct 10 05:04:58 np0005479823 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Oct 10 05:04:58 np0005479823 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Oct 10 05:04:58 np0005479823 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Oct 10 05:04:58 np0005479823 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Oct 10 05:04:58 np0005479823 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Oct 10 05:04:58 np0005479823 kernel: Booting paravirtualized kernel on KVM
Oct 10 05:04:58 np0005479823 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Oct 10 05:04:58 np0005479823 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Oct 10 05:04:58 np0005479823 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Oct 10 05:04:58 np0005479823 kernel: kvm-guest: PV spinlocks disabled, no host support
Oct 10 05:04:58 np0005479823 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-621.el9.x86_64 root=UUID=9839e2e1-98a2-4594-b609-79d514deb0a3 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct 10 05:04:58 np0005479823 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-621.el9.x86_64", will be passed to user space.
Oct 10 05:04:58 np0005479823 kernel: random: crng init done
Oct 10 05:04:58 np0005479823 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Oct 10 05:04:58 np0005479823 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Oct 10 05:04:58 np0005479823 kernel: Fallback order for Node 0: 0 
Oct 10 05:04:58 np0005479823 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Oct 10 05:04:58 np0005479823 kernel: Policy zone: Normal
Oct 10 05:04:58 np0005479823 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Oct 10 05:04:58 np0005479823 kernel: software IO TLB: area num 8.
Oct 10 05:04:58 np0005479823 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Oct 10 05:04:58 np0005479823 kernel: ftrace: allocating 49162 entries in 193 pages
Oct 10 05:04:58 np0005479823 kernel: ftrace: allocated 193 pages with 3 groups
Oct 10 05:04:58 np0005479823 kernel: Dynamic Preempt: voluntary
Oct 10 05:04:58 np0005479823 kernel: rcu: Preemptible hierarchical RCU implementation.
Oct 10 05:04:58 np0005479823 kernel: rcu: 	RCU event tracing is enabled.
Oct 10 05:04:58 np0005479823 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Oct 10 05:04:58 np0005479823 kernel: 	Trampoline variant of Tasks RCU enabled.
Oct 10 05:04:58 np0005479823 kernel: 	Rude variant of Tasks RCU enabled.
Oct 10 05:04:58 np0005479823 kernel: 	Tracing variant of Tasks RCU enabled.
Oct 10 05:04:58 np0005479823 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Oct 10 05:04:58 np0005479823 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Oct 10 05:04:58 np0005479823 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct 10 05:04:58 np0005479823 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct 10 05:04:58 np0005479823 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct 10 05:04:58 np0005479823 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Oct 10 05:04:58 np0005479823 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Oct 10 05:04:58 np0005479823 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Oct 10 05:04:58 np0005479823 kernel: Console: colour VGA+ 80x25
Oct 10 05:04:58 np0005479823 kernel: printk: console [ttyS0] enabled
Oct 10 05:04:58 np0005479823 kernel: ACPI: Core revision 20230331
Oct 10 05:04:58 np0005479823 kernel: APIC: Switch to symmetric I/O mode setup
Oct 10 05:04:58 np0005479823 kernel: x2apic enabled
Oct 10 05:04:58 np0005479823 kernel: APIC: Switched APIC routing to: physical x2apic
Oct 10 05:04:58 np0005479823 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Oct 10 05:04:58 np0005479823 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Oct 10 05:04:58 np0005479823 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Oct 10 05:04:58 np0005479823 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Oct 10 05:04:58 np0005479823 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Oct 10 05:04:58 np0005479823 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Oct 10 05:04:58 np0005479823 kernel: Spectre V2 : Mitigation: Retpolines
Oct 10 05:04:58 np0005479823 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Oct 10 05:04:58 np0005479823 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Oct 10 05:04:58 np0005479823 kernel: RETBleed: Mitigation: untrained return thunk
Oct 10 05:04:58 np0005479823 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Oct 10 05:04:58 np0005479823 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Oct 10 05:04:58 np0005479823 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Oct 10 05:04:58 np0005479823 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Oct 10 05:04:58 np0005479823 kernel: x86/bugs: return thunk changed
Oct 10 05:04:58 np0005479823 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Oct 10 05:04:58 np0005479823 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Oct 10 05:04:58 np0005479823 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Oct 10 05:04:58 np0005479823 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Oct 10 05:04:58 np0005479823 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Oct 10 05:04:58 np0005479823 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Oct 10 05:04:58 np0005479823 kernel: Freeing SMP alternatives memory: 40K
Oct 10 05:04:58 np0005479823 kernel: pid_max: default: 32768 minimum: 301
Oct 10 05:04:58 np0005479823 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Oct 10 05:04:58 np0005479823 kernel: landlock: Up and running.
Oct 10 05:04:58 np0005479823 kernel: Yama: becoming mindful.
Oct 10 05:04:58 np0005479823 kernel: SELinux:  Initializing.
Oct 10 05:04:58 np0005479823 kernel: LSM support for eBPF active
Oct 10 05:04:58 np0005479823 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct 10 05:04:58 np0005479823 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct 10 05:04:58 np0005479823 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Oct 10 05:04:58 np0005479823 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Oct 10 05:04:58 np0005479823 kernel: ... version:                0
Oct 10 05:04:58 np0005479823 kernel: ... bit width:              48
Oct 10 05:04:58 np0005479823 kernel: ... generic registers:      6
Oct 10 05:04:58 np0005479823 kernel: ... value mask:             0000ffffffffffff
Oct 10 05:04:58 np0005479823 kernel: ... max period:             00007fffffffffff
Oct 10 05:04:58 np0005479823 kernel: ... fixed-purpose events:   0
Oct 10 05:04:58 np0005479823 kernel: ... event mask:             000000000000003f
Oct 10 05:04:58 np0005479823 kernel: signal: max sigframe size: 1776
Oct 10 05:04:58 np0005479823 kernel: rcu: Hierarchical SRCU implementation.
Oct 10 05:04:58 np0005479823 kernel: rcu: 	Max phase no-delay instances is 400.
Oct 10 05:04:58 np0005479823 kernel: smp: Bringing up secondary CPUs ...
Oct 10 05:04:58 np0005479823 kernel: smpboot: x86: Booting SMP configuration:
Oct 10 05:04:58 np0005479823 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Oct 10 05:04:58 np0005479823 kernel: smp: Brought up 1 node, 8 CPUs
Oct 10 05:04:58 np0005479823 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Oct 10 05:04:58 np0005479823 kernel: node 0 deferred pages initialised in 15ms
Oct 10 05:04:58 np0005479823 kernel: Memory: 7765864K/8388068K available (16384K kernel code, 5784K rwdata, 13864K rodata, 4188K init, 7196K bss, 616208K reserved, 0K cma-reserved)
Oct 10 05:04:58 np0005479823 kernel: devtmpfs: initialized
Oct 10 05:04:58 np0005479823 kernel: x86/mm: Memory block size: 128MB
Oct 10 05:04:58 np0005479823 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Oct 10 05:04:58 np0005479823 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Oct 10 05:04:58 np0005479823 kernel: pinctrl core: initialized pinctrl subsystem
Oct 10 05:04:58 np0005479823 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Oct 10 05:04:58 np0005479823 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Oct 10 05:04:58 np0005479823 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Oct 10 05:04:58 np0005479823 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Oct 10 05:04:58 np0005479823 kernel: audit: initializing netlink subsys (disabled)
Oct 10 05:04:58 np0005479823 kernel: audit: type=2000 audit(1760087096.570:1): state=initialized audit_enabled=0 res=1
Oct 10 05:04:58 np0005479823 kernel: thermal_sys: Registered thermal governor 'fair_share'
Oct 10 05:04:58 np0005479823 kernel: thermal_sys: Registered thermal governor 'step_wise'
Oct 10 05:04:58 np0005479823 kernel: thermal_sys: Registered thermal governor 'user_space'
Oct 10 05:04:58 np0005479823 kernel: cpuidle: using governor menu
Oct 10 05:04:58 np0005479823 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Oct 10 05:04:58 np0005479823 kernel: PCI: Using configuration type 1 for base access
Oct 10 05:04:58 np0005479823 kernel: PCI: Using configuration type 1 for extended access
Oct 10 05:04:58 np0005479823 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Oct 10 05:04:58 np0005479823 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Oct 10 05:04:58 np0005479823 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Oct 10 05:04:58 np0005479823 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Oct 10 05:04:58 np0005479823 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Oct 10 05:04:58 np0005479823 kernel: Demotion targets for Node 0: null
Oct 10 05:04:58 np0005479823 kernel: cryptd: max_cpu_qlen set to 1000
Oct 10 05:04:58 np0005479823 kernel: ACPI: Added _OSI(Module Device)
Oct 10 05:04:58 np0005479823 kernel: ACPI: Added _OSI(Processor Device)
Oct 10 05:04:58 np0005479823 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Oct 10 05:04:58 np0005479823 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Oct 10 05:04:58 np0005479823 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Oct 10 05:04:58 np0005479823 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Oct 10 05:04:58 np0005479823 kernel: ACPI: Interpreter enabled
Oct 10 05:04:58 np0005479823 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Oct 10 05:04:58 np0005479823 kernel: ACPI: Using IOAPIC for interrupt routing
Oct 10 05:04:58 np0005479823 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Oct 10 05:04:58 np0005479823 kernel: PCI: Using E820 reservations for host bridge windows
Oct 10 05:04:58 np0005479823 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Oct 10 05:04:58 np0005479823 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Oct 10 05:04:58 np0005479823 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [3] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [4] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [5] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [6] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [7] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [8] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [9] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [10] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [11] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [12] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [13] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [14] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [15] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [16] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [17] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [18] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [19] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [20] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [21] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [22] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [23] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [24] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [25] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [26] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [27] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [28] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [29] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [30] registered
Oct 10 05:04:58 np0005479823 kernel: acpiphp: Slot [31] registered
Oct 10 05:04:58 np0005479823 kernel: PCI host bridge to bus 0000:00
Oct 10 05:04:58 np0005479823 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Oct 10 05:04:58 np0005479823 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Oct 10 05:04:58 np0005479823 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Oct 10 05:04:58 np0005479823 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Oct 10 05:04:58 np0005479823 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Oct 10 05:04:58 np0005479823 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Oct 10 05:04:58 np0005479823 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Oct 10 05:04:58 np0005479823 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Oct 10 05:04:58 np0005479823 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Oct 10 05:04:58 np0005479823 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Oct 10 05:04:58 np0005479823 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Oct 10 05:04:58 np0005479823 kernel: iommu: Default domain type: Translated
Oct 10 05:04:58 np0005479823 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Oct 10 05:04:58 np0005479823 kernel: SCSI subsystem initialized
Oct 10 05:04:58 np0005479823 kernel: ACPI: bus type USB registered
Oct 10 05:04:58 np0005479823 kernel: usbcore: registered new interface driver usbfs
Oct 10 05:04:58 np0005479823 kernel: usbcore: registered new interface driver hub
Oct 10 05:04:58 np0005479823 kernel: usbcore: registered new device driver usb
Oct 10 05:04:58 np0005479823 kernel: pps_core: LinuxPPS API ver. 1 registered
Oct 10 05:04:58 np0005479823 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Oct 10 05:04:58 np0005479823 kernel: PTP clock support registered
Oct 10 05:04:58 np0005479823 kernel: EDAC MC: Ver: 3.0.0
Oct 10 05:04:58 np0005479823 kernel: NetLabel: Initializing
Oct 10 05:04:58 np0005479823 kernel: NetLabel:  domain hash size = 128
Oct 10 05:04:58 np0005479823 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Oct 10 05:04:58 np0005479823 kernel: NetLabel:  unlabeled traffic allowed by default
Oct 10 05:04:58 np0005479823 kernel: PCI: Using ACPI for IRQ routing
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Oct 10 05:04:58 np0005479823 kernel: vgaarb: loaded
Oct 10 05:04:58 np0005479823 kernel: clocksource: Switched to clocksource kvm-clock
Oct 10 05:04:58 np0005479823 kernel: VFS: Disk quotas dquot_6.6.0
Oct 10 05:04:58 np0005479823 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Oct 10 05:04:58 np0005479823 kernel: pnp: PnP ACPI init
Oct 10 05:04:58 np0005479823 kernel: pnp: PnP ACPI: found 5 devices
Oct 10 05:04:58 np0005479823 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Oct 10 05:04:58 np0005479823 kernel: NET: Registered PF_INET protocol family
Oct 10 05:04:58 np0005479823 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Oct 10 05:04:58 np0005479823 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Oct 10 05:04:58 np0005479823 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Oct 10 05:04:58 np0005479823 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Oct 10 05:04:58 np0005479823 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Oct 10 05:04:58 np0005479823 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Oct 10 05:04:58 np0005479823 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Oct 10 05:04:58 np0005479823 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct 10 05:04:58 np0005479823 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct 10 05:04:58 np0005479823 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Oct 10 05:04:58 np0005479823 kernel: NET: Registered PF_XDP protocol family
Oct 10 05:04:58 np0005479823 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Oct 10 05:04:58 np0005479823 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Oct 10 05:04:58 np0005479823 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Oct 10 05:04:58 np0005479823 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Oct 10 05:04:58 np0005479823 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Oct 10 05:04:58 np0005479823 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Oct 10 05:04:58 np0005479823 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 82018 usecs
Oct 10 05:04:58 np0005479823 kernel: PCI: CLS 0 bytes, default 64
Oct 10 05:04:58 np0005479823 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Oct 10 05:04:58 np0005479823 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Oct 10 05:04:58 np0005479823 kernel: Trying to unpack rootfs image as initramfs...
Oct 10 05:04:58 np0005479823 kernel: ACPI: bus type thunderbolt registered
Oct 10 05:04:58 np0005479823 kernel: Initialise system trusted keyrings
Oct 10 05:04:58 np0005479823 kernel: Key type blacklist registered
Oct 10 05:04:58 np0005479823 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Oct 10 05:04:58 np0005479823 kernel: zbud: loaded
Oct 10 05:04:58 np0005479823 kernel: integrity: Platform Keyring initialized
Oct 10 05:04:58 np0005479823 kernel: integrity: Machine keyring initialized
Oct 10 05:04:58 np0005479823 kernel: Freeing initrd memory: 85808K
Oct 10 05:04:58 np0005479823 kernel: NET: Registered PF_ALG protocol family
Oct 10 05:04:58 np0005479823 kernel: xor: automatically using best checksumming function   avx       
Oct 10 05:04:58 np0005479823 kernel: Key type asymmetric registered
Oct 10 05:04:58 np0005479823 kernel: Asymmetric key parser 'x509' registered
Oct 10 05:04:58 np0005479823 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Oct 10 05:04:58 np0005479823 kernel: io scheduler mq-deadline registered
Oct 10 05:04:58 np0005479823 kernel: io scheduler kyber registered
Oct 10 05:04:58 np0005479823 kernel: io scheduler bfq registered
Oct 10 05:04:58 np0005479823 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Oct 10 05:04:58 np0005479823 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Oct 10 05:04:58 np0005479823 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Oct 10 05:04:58 np0005479823 kernel: ACPI: button: Power Button [PWRF]
Oct 10 05:04:58 np0005479823 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Oct 10 05:04:58 np0005479823 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Oct 10 05:04:58 np0005479823 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Oct 10 05:04:58 np0005479823 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Oct 10 05:04:58 np0005479823 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Oct 10 05:04:58 np0005479823 kernel: Non-volatile memory driver v1.3
Oct 10 05:04:58 np0005479823 kernel: rdac: device handler registered
Oct 10 05:04:58 np0005479823 kernel: hp_sw: device handler registered
Oct 10 05:04:58 np0005479823 kernel: emc: device handler registered
Oct 10 05:04:58 np0005479823 kernel: alua: device handler registered
Oct 10 05:04:58 np0005479823 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Oct 10 05:04:58 np0005479823 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Oct 10 05:04:58 np0005479823 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Oct 10 05:04:58 np0005479823 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Oct 10 05:04:58 np0005479823 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Oct 10 05:04:58 np0005479823 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Oct 10 05:04:58 np0005479823 kernel: usb usb1: Product: UHCI Host Controller
Oct 10 05:04:58 np0005479823 kernel: usb usb1: Manufacturer: Linux 5.14.0-621.el9.x86_64 uhci_hcd
Oct 10 05:04:58 np0005479823 kernel: usb usb1: SerialNumber: 0000:00:01.2
Oct 10 05:04:58 np0005479823 kernel: hub 1-0:1.0: USB hub found
Oct 10 05:04:58 np0005479823 kernel: hub 1-0:1.0: 2 ports detected
Oct 10 05:04:58 np0005479823 kernel: usbcore: registered new interface driver usbserial_generic
Oct 10 05:04:58 np0005479823 kernel: usbserial: USB Serial support registered for generic
Oct 10 05:04:58 np0005479823 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Oct 10 05:04:58 np0005479823 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Oct 10 05:04:58 np0005479823 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Oct 10 05:04:58 np0005479823 kernel: mousedev: PS/2 mouse device common for all mice
Oct 10 05:04:58 np0005479823 kernel: rtc_cmos 00:04: RTC can wake from S4
Oct 10 05:04:58 np0005479823 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Oct 10 05:04:58 np0005479823 kernel: rtc_cmos 00:04: registered as rtc0
Oct 10 05:04:58 np0005479823 kernel: rtc_cmos 00:04: setting system clock to 2025-10-10T09:04:57 UTC (1760087097)
Oct 10 05:04:58 np0005479823 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Oct 10 05:04:58 np0005479823 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Oct 10 05:04:58 np0005479823 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Oct 10 05:04:58 np0005479823 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Oct 10 05:04:58 np0005479823 kernel: hid: raw HID events driver (C) Jiri Kosina
Oct 10 05:04:58 np0005479823 kernel: usbcore: registered new interface driver usbhid
Oct 10 05:04:58 np0005479823 kernel: usbhid: USB HID core driver
Oct 10 05:04:58 np0005479823 kernel: drop_monitor: Initializing network drop monitor service
Oct 10 05:04:58 np0005479823 kernel: Initializing XFRM netlink socket
Oct 10 05:04:58 np0005479823 kernel: NET: Registered PF_INET6 protocol family
Oct 10 05:04:58 np0005479823 kernel: Segment Routing with IPv6
Oct 10 05:04:58 np0005479823 kernel: NET: Registered PF_PACKET protocol family
Oct 10 05:04:58 np0005479823 kernel: mpls_gso: MPLS GSO support
Oct 10 05:04:58 np0005479823 kernel: IPI shorthand broadcast: enabled
Oct 10 05:04:58 np0005479823 kernel: AVX2 version of gcm_enc/dec engaged.
Oct 10 05:04:58 np0005479823 kernel: AES CTR mode by8 optimization enabled
Oct 10 05:04:58 np0005479823 kernel: sched_clock: Marking stable (1148024068, 149235727)->(1414112165, -116852370)
Oct 10 05:04:58 np0005479823 kernel: registered taskstats version 1
Oct 10 05:04:58 np0005479823 kernel: Loading compiled-in X.509 certificates
Oct 10 05:04:58 np0005479823 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 72f99a463516b0dfb027e50caab189f607ef1bc9'
Oct 10 05:04:58 np0005479823 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Oct 10 05:04:58 np0005479823 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Oct 10 05:04:58 np0005479823 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Oct 10 05:04:58 np0005479823 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Oct 10 05:04:58 np0005479823 kernel: Demotion targets for Node 0: null
Oct 10 05:04:58 np0005479823 kernel: page_owner is disabled
Oct 10 05:04:58 np0005479823 kernel: Key type .fscrypt registered
Oct 10 05:04:58 np0005479823 kernel: Key type fscrypt-provisioning registered
Oct 10 05:04:58 np0005479823 kernel: Key type big_key registered
Oct 10 05:04:58 np0005479823 kernel: Key type encrypted registered
Oct 10 05:04:58 np0005479823 kernel: ima: No TPM chip found, activating TPM-bypass!
Oct 10 05:04:58 np0005479823 kernel: Loading compiled-in module X.509 certificates
Oct 10 05:04:58 np0005479823 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 72f99a463516b0dfb027e50caab189f607ef1bc9'
Oct 10 05:04:58 np0005479823 kernel: ima: Allocated hash algorithm: sha256
Oct 10 05:04:58 np0005479823 kernel: ima: No architecture policies found
Oct 10 05:04:58 np0005479823 kernel: evm: Initialising EVM extended attributes:
Oct 10 05:04:58 np0005479823 kernel: evm: security.selinux
Oct 10 05:04:58 np0005479823 kernel: evm: security.SMACK64 (disabled)
Oct 10 05:04:58 np0005479823 kernel: evm: security.SMACK64EXEC (disabled)
Oct 10 05:04:58 np0005479823 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Oct 10 05:04:58 np0005479823 kernel: evm: security.SMACK64MMAP (disabled)
Oct 10 05:04:58 np0005479823 kernel: evm: security.apparmor (disabled)
Oct 10 05:04:58 np0005479823 kernel: evm: security.ima
Oct 10 05:04:58 np0005479823 kernel: evm: security.capability
Oct 10 05:04:58 np0005479823 kernel: evm: HMAC attrs: 0x1
Oct 10 05:04:58 np0005479823 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Oct 10 05:04:58 np0005479823 kernel: Running certificate verification RSA selftest
Oct 10 05:04:58 np0005479823 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Oct 10 05:04:58 np0005479823 kernel: Running certificate verification ECDSA selftest
Oct 10 05:04:58 np0005479823 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Oct 10 05:04:58 np0005479823 kernel: clk: Disabling unused clocks
Oct 10 05:04:58 np0005479823 kernel: Freeing unused decrypted memory: 2028K
Oct 10 05:04:58 np0005479823 kernel: Freeing unused kernel image (initmem) memory: 4188K
Oct 10 05:04:58 np0005479823 kernel: Write protecting the kernel read-only data: 30720k
Oct 10 05:04:58 np0005479823 kernel: Freeing unused kernel image (rodata/data gap) memory: 472K
Oct 10 05:04:58 np0005479823 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Oct 10 05:04:58 np0005479823 kernel: Run /init as init process
Oct 10 05:04:58 np0005479823 systemd: systemd 252-57.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct 10 05:04:58 np0005479823 systemd: Detected virtualization kvm.
Oct 10 05:04:58 np0005479823 systemd: Detected architecture x86-64.
Oct 10 05:04:58 np0005479823 systemd: Running in initrd.
Oct 10 05:04:58 np0005479823 systemd: No hostname configured, using default hostname.
Oct 10 05:04:58 np0005479823 systemd: Hostname set to <localhost>.
Oct 10 05:04:58 np0005479823 systemd: Initializing machine ID from VM UUID.
Oct 10 05:04:58 np0005479823 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Oct 10 05:04:58 np0005479823 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Oct 10 05:04:58 np0005479823 kernel: usb 1-1: Product: QEMU USB Tablet
Oct 10 05:04:58 np0005479823 kernel: usb 1-1: Manufacturer: QEMU
Oct 10 05:04:58 np0005479823 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Oct 10 05:04:58 np0005479823 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Oct 10 05:04:58 np0005479823 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Oct 10 05:04:58 np0005479823 systemd: Queued start job for default target Initrd Default Target.
Oct 10 05:04:58 np0005479823 systemd: Started Dispatch Password Requests to Console Directory Watch.
Oct 10 05:04:58 np0005479823 systemd: Reached target Local Encrypted Volumes.
Oct 10 05:04:58 np0005479823 systemd: Reached target Initrd /usr File System.
Oct 10 05:04:58 np0005479823 systemd: Reached target Local File Systems.
Oct 10 05:04:58 np0005479823 systemd: Reached target Path Units.
Oct 10 05:04:58 np0005479823 systemd: Reached target Slice Units.
Oct 10 05:04:58 np0005479823 systemd: Reached target Swaps.
Oct 10 05:04:58 np0005479823 systemd: Reached target Timer Units.
Oct 10 05:04:58 np0005479823 systemd: Listening on D-Bus System Message Bus Socket.
Oct 10 05:04:58 np0005479823 systemd: Listening on Journal Socket (/dev/log).
Oct 10 05:04:58 np0005479823 systemd: Listening on Journal Socket.
Oct 10 05:04:58 np0005479823 systemd: Listening on udev Control Socket.
Oct 10 05:04:58 np0005479823 systemd: Listening on udev Kernel Socket.
Oct 10 05:04:58 np0005479823 systemd: Reached target Socket Units.
Oct 10 05:04:58 np0005479823 systemd: Starting Create List of Static Device Nodes...
Oct 10 05:04:58 np0005479823 systemd: Starting Journal Service...
Oct 10 05:04:58 np0005479823 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct 10 05:04:58 np0005479823 systemd: Starting Apply Kernel Variables...
Oct 10 05:04:58 np0005479823 systemd: Starting Create System Users...
Oct 10 05:04:58 np0005479823 systemd: Starting Setup Virtual Console...
Oct 10 05:04:58 np0005479823 systemd: Finished Create List of Static Device Nodes.
Oct 10 05:04:58 np0005479823 systemd: Finished Apply Kernel Variables.
Oct 10 05:04:58 np0005479823 systemd: Finished Create System Users.
Oct 10 05:04:58 np0005479823 systemd-journald[305]: Journal started
Oct 10 05:04:58 np0005479823 systemd-journald[305]: Runtime Journal (/run/log/journal/55d065af02524401ad6e822a36bead06) is 8.0M, max 153.6M, 145.6M free.
Oct 10 05:04:58 np0005479823 systemd-sysusers[309]: Creating group 'users' with GID 100.
Oct 10 05:04:58 np0005479823 systemd-sysusers[309]: Creating group 'dbus' with GID 81.
Oct 10 05:04:58 np0005479823 systemd-sysusers[309]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Oct 10 05:04:58 np0005479823 systemd: Started Journal Service.
Oct 10 05:04:58 np0005479823 systemd[1]: Starting Create Static Device Nodes in /dev...
Oct 10 05:04:58 np0005479823 systemd[1]: Starting Create Volatile Files and Directories...
Oct 10 05:04:58 np0005479823 systemd[1]: Finished Create Static Device Nodes in /dev.
Oct 10 05:04:58 np0005479823 systemd[1]: Finished Create Volatile Files and Directories.
Oct 10 05:04:58 np0005479823 systemd[1]: Finished Setup Virtual Console.
Oct 10 05:04:58 np0005479823 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Oct 10 05:04:58 np0005479823 systemd[1]: Starting dracut cmdline hook...
Oct 10 05:04:58 np0005479823 dracut-cmdline[322]: dracut-9 dracut-057-102.git20250818.el9
Oct 10 05:04:58 np0005479823 dracut-cmdline[322]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-621.el9.x86_64 root=UUID=9839e2e1-98a2-4594-b609-79d514deb0a3 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct 10 05:04:58 np0005479823 systemd[1]: Finished dracut cmdline hook.
Oct 10 05:04:58 np0005479823 systemd[1]: Starting dracut pre-udev hook...
Oct 10 05:04:58 np0005479823 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Oct 10 05:04:58 np0005479823 kernel: device-mapper: uevent: version 1.0.3
Oct 10 05:04:58 np0005479823 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Oct 10 05:04:58 np0005479823 kernel: RPC: Registered named UNIX socket transport module.
Oct 10 05:04:58 np0005479823 kernel: RPC: Registered udp transport module.
Oct 10 05:04:58 np0005479823 kernel: RPC: Registered tcp transport module.
Oct 10 05:04:58 np0005479823 kernel: RPC: Registered tcp-with-tls transport module.
Oct 10 05:04:58 np0005479823 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Oct 10 05:04:58 np0005479823 rpc.statd[439]: Version 2.5.4 starting
Oct 10 05:04:58 np0005479823 rpc.statd[439]: Initializing NSM state
Oct 10 05:04:58 np0005479823 rpc.idmapd[444]: Setting log level to 0
Oct 10 05:04:58 np0005479823 systemd[1]: Finished dracut pre-udev hook.
Oct 10 05:04:58 np0005479823 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct 10 05:04:58 np0005479823 systemd-udevd[457]: Using default interface naming scheme 'rhel-9.0'.
Oct 10 05:04:58 np0005479823 systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct 10 05:04:58 np0005479823 systemd[1]: Starting dracut pre-trigger hook...
Oct 10 05:04:58 np0005479823 systemd[1]: Finished dracut pre-trigger hook.
Oct 10 05:04:58 np0005479823 systemd[1]: Starting Coldplug All udev Devices...
Oct 10 05:04:58 np0005479823 systemd[1]: Created slice Slice /system/modprobe.
Oct 10 05:04:58 np0005479823 systemd[1]: Starting Load Kernel Module configfs...
Oct 10 05:04:58 np0005479823 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 10 05:04:58 np0005479823 systemd[1]: Finished Load Kernel Module configfs.
Oct 10 05:04:58 np0005479823 systemd[1]: Finished Coldplug All udev Devices.
Oct 10 05:04:58 np0005479823 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct 10 05:04:58 np0005479823 systemd[1]: Reached target Network.
Oct 10 05:04:58 np0005479823 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct 10 05:04:58 np0005479823 systemd[1]: Starting dracut initqueue hook...
Oct 10 05:04:58 np0005479823 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Oct 10 05:04:58 np0005479823 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Oct 10 05:04:58 np0005479823 kernel: vda: vda1
Oct 10 05:04:58 np0005479823 systemd-udevd[473]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 05:04:58 np0005479823 kernel: scsi host0: ata_piix
Oct 10 05:04:58 np0005479823 kernel: scsi host1: ata_piix
Oct 10 05:04:58 np0005479823 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Oct 10 05:04:58 np0005479823 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Oct 10 05:04:58 np0005479823 systemd[1]: Found device /dev/disk/by-uuid/9839e2e1-98a2-4594-b609-79d514deb0a3.
Oct 10 05:04:59 np0005479823 systemd[1]: Reached target Initrd Root Device.
Oct 10 05:04:59 np0005479823 systemd[1]: Mounting Kernel Configuration File System...
Oct 10 05:04:59 np0005479823 systemd[1]: Mounted Kernel Configuration File System.
Oct 10 05:04:59 np0005479823 systemd[1]: Reached target System Initialization.
Oct 10 05:04:59 np0005479823 kernel: ata1: found unknown device (class 0)
Oct 10 05:04:59 np0005479823 systemd[1]: Reached target Basic System.
Oct 10 05:04:59 np0005479823 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Oct 10 05:04:59 np0005479823 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Oct 10 05:04:59 np0005479823 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Oct 10 05:04:59 np0005479823 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Oct 10 05:04:59 np0005479823 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Oct 10 05:04:59 np0005479823 systemd[1]: Finished dracut initqueue hook.
Oct 10 05:04:59 np0005479823 systemd[1]: Reached target Preparation for Remote File Systems.
Oct 10 05:04:59 np0005479823 systemd[1]: Reached target Remote Encrypted Volumes.
Oct 10 05:04:59 np0005479823 systemd[1]: Reached target Remote File Systems.
Oct 10 05:04:59 np0005479823 systemd[1]: Starting dracut pre-mount hook...
Oct 10 05:04:59 np0005479823 systemd[1]: Finished dracut pre-mount hook.
Oct 10 05:04:59 np0005479823 systemd[1]: Starting File System Check on /dev/disk/by-uuid/9839e2e1-98a2-4594-b609-79d514deb0a3...
Oct 10 05:04:59 np0005479823 systemd-fsck[556]: /usr/sbin/fsck.xfs: XFS file system.
Oct 10 05:04:59 np0005479823 systemd[1]: Finished File System Check on /dev/disk/by-uuid/9839e2e1-98a2-4594-b609-79d514deb0a3.
Oct 10 05:04:59 np0005479823 systemd[1]: Mounting /sysroot...
Oct 10 05:04:59 np0005479823 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Oct 10 05:04:59 np0005479823 kernel: XFS (vda1): Mounting V5 Filesystem 9839e2e1-98a2-4594-b609-79d514deb0a3
Oct 10 05:04:59 np0005479823 kernel: XFS (vda1): Ending clean mount
Oct 10 05:04:59 np0005479823 systemd[1]: Mounted /sysroot.
Oct 10 05:04:59 np0005479823 systemd[1]: Reached target Initrd Root File System.
Oct 10 05:04:59 np0005479823 systemd[1]: Starting Mountpoints Configured in the Real Root...
Oct 10 05:04:59 np0005479823 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Oct 10 05:04:59 np0005479823 systemd[1]: Finished Mountpoints Configured in the Real Root.
Oct 10 05:04:59 np0005479823 systemd[1]: Reached target Initrd File Systems.
Oct 10 05:04:59 np0005479823 systemd[1]: Reached target Initrd Default Target.
Oct 10 05:04:59 np0005479823 systemd[1]: Starting dracut mount hook...
Oct 10 05:04:59 np0005479823 systemd[1]: Finished dracut mount hook.
Oct 10 05:04:59 np0005479823 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Oct 10 05:04:59 np0005479823 rpc.idmapd[444]: exiting on signal 15
Oct 10 05:05:00 np0005479823 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Oct 10 05:05:00 np0005479823 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped target Network.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped target Remote Encrypted Volumes.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped target Timer Units.
Oct 10 05:05:00 np0005479823 systemd[1]: dbus.socket: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: Closed D-Bus System Message Bus Socket.
Oct 10 05:05:00 np0005479823 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped target Initrd Default Target.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped target Basic System.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped target Initrd Root Device.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped target Initrd /usr File System.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped target Path Units.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped target Remote File Systems.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped target Preparation for Remote File Systems.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped target Slice Units.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped target Socket Units.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped target System Initialization.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped target Local File Systems.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped target Swaps.
Oct 10 05:05:00 np0005479823 systemd[1]: dracut-mount.service: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped dracut mount hook.
Oct 10 05:05:00 np0005479823 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped dracut pre-mount hook.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped target Local Encrypted Volumes.
Oct 10 05:05:00 np0005479823 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Oct 10 05:05:00 np0005479823 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped dracut initqueue hook.
Oct 10 05:05:00 np0005479823 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped Apply Kernel Variables.
Oct 10 05:05:00 np0005479823 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped Create Volatile Files and Directories.
Oct 10 05:05:00 np0005479823 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped Coldplug All udev Devices.
Oct 10 05:05:00 np0005479823 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped dracut pre-trigger hook.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Oct 10 05:05:00 np0005479823 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped Setup Virtual Console.
Oct 10 05:05:00 np0005479823 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Oct 10 05:05:00 np0005479823 systemd[1]: systemd-udevd.service: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Oct 10 05:05:00 np0005479823 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: Closed udev Control Socket.
Oct 10 05:05:00 np0005479823 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: Closed udev Kernel Socket.
Oct 10 05:05:00 np0005479823 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped dracut pre-udev hook.
Oct 10 05:05:00 np0005479823 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped dracut cmdline hook.
Oct 10 05:05:00 np0005479823 systemd[1]: Starting Cleanup udev Database...
Oct 10 05:05:00 np0005479823 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped Create Static Device Nodes in /dev.
Oct 10 05:05:00 np0005479823 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped Create List of Static Device Nodes.
Oct 10 05:05:00 np0005479823 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: Stopped Create System Users.
Oct 10 05:05:00 np0005479823 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Oct 10 05:05:00 np0005479823 systemd[1]: Finished Cleanup udev Database.
Oct 10 05:05:00 np0005479823 systemd[1]: Reached target Switch Root.
Oct 10 05:05:00 np0005479823 systemd[1]: Starting Switch Root...
Oct 10 05:05:00 np0005479823 systemd[1]: Switching root.
Oct 10 05:05:00 np0005479823 systemd-journald[305]: Journal stopped
Oct 10 05:05:01 np0005479823 systemd-journald: Received SIGTERM from PID 1 (systemd).
Oct 10 05:05:01 np0005479823 kernel: audit: type=1404 audit(1760087100.288:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Oct 10 05:05:01 np0005479823 kernel: SELinux:  policy capability network_peer_controls=1
Oct 10 05:05:01 np0005479823 kernel: SELinux:  policy capability open_perms=1
Oct 10 05:05:01 np0005479823 kernel: SELinux:  policy capability extended_socket_class=1
Oct 10 05:05:01 np0005479823 kernel: SELinux:  policy capability always_check_network=0
Oct 10 05:05:01 np0005479823 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 10 05:05:01 np0005479823 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 10 05:05:01 np0005479823 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 10 05:05:01 np0005479823 kernel: audit: type=1403 audit(1760087100.425:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Oct 10 05:05:01 np0005479823 systemd: Successfully loaded SELinux policy in 142.114ms.
Oct 10 05:05:01 np0005479823 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 30.379ms.
Oct 10 05:05:01 np0005479823 systemd: systemd 252-57.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct 10 05:05:01 np0005479823 systemd: Detected virtualization kvm.
Oct 10 05:05:01 np0005479823 systemd: Detected architecture x86-64.
Oct 10 05:05:01 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:05:01 np0005479823 systemd: initrd-switch-root.service: Deactivated successfully.
Oct 10 05:05:01 np0005479823 systemd: Stopped Switch Root.
Oct 10 05:05:01 np0005479823 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Oct 10 05:05:01 np0005479823 systemd: Created slice Slice /system/getty.
Oct 10 05:05:01 np0005479823 systemd: Created slice Slice /system/serial-getty.
Oct 10 05:05:01 np0005479823 systemd: Created slice Slice /system/sshd-keygen.
Oct 10 05:05:01 np0005479823 systemd: Created slice User and Session Slice.
Oct 10 05:05:01 np0005479823 systemd: Started Dispatch Password Requests to Console Directory Watch.
Oct 10 05:05:01 np0005479823 systemd: Started Forward Password Requests to Wall Directory Watch.
Oct 10 05:05:01 np0005479823 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Oct 10 05:05:01 np0005479823 systemd: Reached target Local Encrypted Volumes.
Oct 10 05:05:01 np0005479823 systemd: Stopped target Switch Root.
Oct 10 05:05:01 np0005479823 systemd: Stopped target Initrd File Systems.
Oct 10 05:05:01 np0005479823 systemd: Stopped target Initrd Root File System.
Oct 10 05:05:01 np0005479823 systemd: Reached target Local Integrity Protected Volumes.
Oct 10 05:05:01 np0005479823 systemd: Reached target Path Units.
Oct 10 05:05:01 np0005479823 systemd: Reached target rpc_pipefs.target.
Oct 10 05:05:01 np0005479823 systemd: Reached target Slice Units.
Oct 10 05:05:01 np0005479823 systemd: Reached target Swaps.
Oct 10 05:05:01 np0005479823 systemd: Reached target Local Verity Protected Volumes.
Oct 10 05:05:01 np0005479823 systemd: Listening on RPCbind Server Activation Socket.
Oct 10 05:05:01 np0005479823 systemd: Reached target RPC Port Mapper.
Oct 10 05:05:01 np0005479823 systemd: Listening on Process Core Dump Socket.
Oct 10 05:05:01 np0005479823 systemd: Listening on initctl Compatibility Named Pipe.
Oct 10 05:05:01 np0005479823 systemd: Listening on udev Control Socket.
Oct 10 05:05:01 np0005479823 systemd: Listening on udev Kernel Socket.
Oct 10 05:05:01 np0005479823 systemd: Mounting Huge Pages File System...
Oct 10 05:05:01 np0005479823 systemd: Mounting POSIX Message Queue File System...
Oct 10 05:05:01 np0005479823 systemd: Mounting Kernel Debug File System...
Oct 10 05:05:01 np0005479823 systemd: Mounting Kernel Trace File System...
Oct 10 05:05:01 np0005479823 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct 10 05:05:01 np0005479823 systemd: Starting Create List of Static Device Nodes...
Oct 10 05:05:01 np0005479823 systemd: Starting Load Kernel Module configfs...
Oct 10 05:05:01 np0005479823 systemd: Starting Load Kernel Module drm...
Oct 10 05:05:01 np0005479823 systemd: Starting Load Kernel Module efi_pstore...
Oct 10 05:05:01 np0005479823 systemd: Starting Load Kernel Module fuse...
Oct 10 05:05:01 np0005479823 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Oct 10 05:05:01 np0005479823 systemd: systemd-fsck-root.service: Deactivated successfully.
Oct 10 05:05:01 np0005479823 systemd: Stopped File System Check on Root Device.
Oct 10 05:05:01 np0005479823 systemd: Stopped Journal Service.
Oct 10 05:05:01 np0005479823 systemd: Starting Journal Service...
Oct 10 05:05:01 np0005479823 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct 10 05:05:01 np0005479823 systemd: Starting Generate network units from Kernel command line...
Oct 10 05:05:01 np0005479823 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 10 05:05:01 np0005479823 systemd: Starting Remount Root and Kernel File Systems...
Oct 10 05:05:01 np0005479823 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Oct 10 05:05:01 np0005479823 systemd: Starting Apply Kernel Variables...
Oct 10 05:05:01 np0005479823 systemd: Starting Coldplug All udev Devices...
Oct 10 05:05:01 np0005479823 kernel: fuse: init (API version 7.37)
Oct 10 05:05:01 np0005479823 systemd-journald[678]: Journal started
Oct 10 05:05:01 np0005479823 systemd-journald[678]: Runtime Journal (/run/log/journal/a1727ec20198bc6caf436a6e13c4ff5e) is 8.0M, max 153.6M, 145.6M free.
Oct 10 05:05:01 np0005479823 systemd[1]: Queued start job for default target Multi-User System.
Oct 10 05:05:01 np0005479823 systemd[1]: systemd-journald.service: Deactivated successfully.
Oct 10 05:05:01 np0005479823 systemd: Mounted Huge Pages File System.
Oct 10 05:05:01 np0005479823 systemd: Started Journal Service.
Oct 10 05:05:01 np0005479823 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Oct 10 05:05:01 np0005479823 systemd[1]: Mounted POSIX Message Queue File System.
Oct 10 05:05:01 np0005479823 systemd[1]: Mounted Kernel Debug File System.
Oct 10 05:05:01 np0005479823 systemd[1]: Mounted Kernel Trace File System.
Oct 10 05:05:01 np0005479823 systemd[1]: Finished Create List of Static Device Nodes.
Oct 10 05:05:01 np0005479823 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 10 05:05:01 np0005479823 systemd[1]: Finished Load Kernel Module configfs.
Oct 10 05:05:01 np0005479823 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Oct 10 05:05:01 np0005479823 systemd[1]: Finished Load Kernel Module efi_pstore.
Oct 10 05:05:01 np0005479823 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Oct 10 05:05:01 np0005479823 systemd[1]: Finished Load Kernel Module fuse.
Oct 10 05:05:01 np0005479823 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Oct 10 05:05:01 np0005479823 systemd[1]: Finished Generate network units from Kernel command line.
Oct 10 05:05:01 np0005479823 systemd[1]: Finished Remount Root and Kernel File Systems.
Oct 10 05:05:01 np0005479823 systemd[1]: Mounting FUSE Control File System...
Oct 10 05:05:01 np0005479823 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct 10 05:05:01 np0005479823 kernel: ACPI: bus type drm_connector registered
Oct 10 05:05:01 np0005479823 systemd[1]: Starting Rebuild Hardware Database...
Oct 10 05:05:01 np0005479823 systemd[1]: Starting Flush Journal to Persistent Storage...
Oct 10 05:05:01 np0005479823 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Oct 10 05:05:01 np0005479823 systemd[1]: Starting Load/Save OS Random Seed...
Oct 10 05:05:01 np0005479823 systemd[1]: Starting Create System Users...
Oct 10 05:05:01 np0005479823 systemd[1]: modprobe@drm.service: Deactivated successfully.
Oct 10 05:05:01 np0005479823 systemd[1]: Finished Load Kernel Module drm.
Oct 10 05:05:01 np0005479823 systemd[1]: Finished Apply Kernel Variables.
Oct 10 05:05:01 np0005479823 systemd-journald[678]: Runtime Journal (/run/log/journal/a1727ec20198bc6caf436a6e13c4ff5e) is 8.0M, max 153.6M, 145.6M free.
Oct 10 05:05:01 np0005479823 systemd-journald[678]: Received client request to flush runtime journal.
Oct 10 05:05:01 np0005479823 systemd[1]: Mounted FUSE Control File System.
Oct 10 05:05:01 np0005479823 systemd[1]: Finished Flush Journal to Persistent Storage.
Oct 10 05:05:01 np0005479823 systemd[1]: Finished Load/Save OS Random Seed.
Oct 10 05:05:01 np0005479823 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct 10 05:05:01 np0005479823 systemd[1]: Finished Create System Users.
Oct 10 05:05:01 np0005479823 systemd[1]: Starting Create Static Device Nodes in /dev...
Oct 10 05:05:01 np0005479823 systemd[1]: Finished Coldplug All udev Devices.
Oct 10 05:05:01 np0005479823 systemd[1]: Finished Create Static Device Nodes in /dev.
Oct 10 05:05:01 np0005479823 systemd[1]: Reached target Preparation for Local File Systems.
Oct 10 05:05:01 np0005479823 systemd[1]: Reached target Local File Systems.
Oct 10 05:05:01 np0005479823 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Oct 10 05:05:01 np0005479823 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Oct 10 05:05:01 np0005479823 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Oct 10 05:05:01 np0005479823 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Oct 10 05:05:01 np0005479823 systemd[1]: Starting Automatic Boot Loader Update...
Oct 10 05:05:01 np0005479823 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Oct 10 05:05:01 np0005479823 systemd[1]: Starting Create Volatile Files and Directories...
Oct 10 05:05:01 np0005479823 bootctl[696]: Couldn't find EFI system partition, skipping.
Oct 10 05:05:01 np0005479823 systemd[1]: Finished Automatic Boot Loader Update.
Oct 10 05:05:01 np0005479823 systemd[1]: Finished Create Volatile Files and Directories.
Oct 10 05:05:01 np0005479823 systemd[1]: Starting Security Auditing Service...
Oct 10 05:05:01 np0005479823 systemd[1]: Starting RPC Bind...
Oct 10 05:05:01 np0005479823 systemd[1]: Starting Rebuild Journal Catalog...
Oct 10 05:05:01 np0005479823 auditd[702]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Oct 10 05:05:01 np0005479823 auditd[702]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Oct 10 05:05:01 np0005479823 systemd[1]: Finished Rebuild Journal Catalog.
Oct 10 05:05:01 np0005479823 systemd[1]: Started RPC Bind.
Oct 10 05:05:01 np0005479823 augenrules[707]: /sbin/augenrules: No change
Oct 10 05:05:01 np0005479823 augenrules[722]: No rules
Oct 10 05:05:01 np0005479823 augenrules[722]: enabled 1
Oct 10 05:05:01 np0005479823 augenrules[722]: failure 1
Oct 10 05:05:01 np0005479823 augenrules[722]: pid 702
Oct 10 05:05:01 np0005479823 augenrules[722]: rate_limit 0
Oct 10 05:05:01 np0005479823 augenrules[722]: backlog_limit 8192
Oct 10 05:05:01 np0005479823 augenrules[722]: lost 0
Oct 10 05:05:01 np0005479823 augenrules[722]: backlog 1
Oct 10 05:05:01 np0005479823 augenrules[722]: backlog_wait_time 60000
Oct 10 05:05:01 np0005479823 augenrules[722]: backlog_wait_time_actual 0
Oct 10 05:05:01 np0005479823 augenrules[722]: enabled 1
Oct 10 05:05:01 np0005479823 augenrules[722]: failure 1
Oct 10 05:05:01 np0005479823 augenrules[722]: pid 702
Oct 10 05:05:01 np0005479823 augenrules[722]: rate_limit 0
Oct 10 05:05:01 np0005479823 augenrules[722]: backlog_limit 8192
Oct 10 05:05:01 np0005479823 augenrules[722]: lost 0
Oct 10 05:05:01 np0005479823 augenrules[722]: backlog 2
Oct 10 05:05:01 np0005479823 augenrules[722]: backlog_wait_time 60000
Oct 10 05:05:01 np0005479823 augenrules[722]: backlog_wait_time_actual 0
Oct 10 05:05:01 np0005479823 augenrules[722]: enabled 1
Oct 10 05:05:01 np0005479823 augenrules[722]: failure 1
Oct 10 05:05:01 np0005479823 augenrules[722]: pid 702
Oct 10 05:05:01 np0005479823 augenrules[722]: rate_limit 0
Oct 10 05:05:01 np0005479823 augenrules[722]: backlog_limit 8192
Oct 10 05:05:01 np0005479823 augenrules[722]: lost 0
Oct 10 05:05:01 np0005479823 augenrules[722]: backlog 4
Oct 10 05:05:01 np0005479823 augenrules[722]: backlog_wait_time 60000
Oct 10 05:05:01 np0005479823 augenrules[722]: backlog_wait_time_actual 0
Oct 10 05:05:01 np0005479823 systemd[1]: Started Security Auditing Service.
Oct 10 05:05:01 np0005479823 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Oct 10 05:05:01 np0005479823 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Oct 10 05:05:01 np0005479823 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Oct 10 05:05:01 np0005479823 systemd[1]: Finished Rebuild Hardware Database.
Oct 10 05:05:01 np0005479823 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct 10 05:05:01 np0005479823 systemd[1]: Starting Update is Completed...
Oct 10 05:05:01 np0005479823 systemd[1]: Finished Update is Completed.
Oct 10 05:05:01 np0005479823 systemd-udevd[730]: Using default interface naming scheme 'rhel-9.0'.
Oct 10 05:05:01 np0005479823 systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct 10 05:05:01 np0005479823 systemd[1]: Reached target System Initialization.
Oct 10 05:05:01 np0005479823 systemd[1]: Started dnf makecache --timer.
Oct 10 05:05:01 np0005479823 systemd[1]: Started Daily rotation of log files.
Oct 10 05:05:01 np0005479823 systemd[1]: Started Daily Cleanup of Temporary Directories.
Oct 10 05:05:01 np0005479823 systemd[1]: Reached target Timer Units.
Oct 10 05:05:01 np0005479823 systemd[1]: Listening on D-Bus System Message Bus Socket.
Oct 10 05:05:01 np0005479823 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Oct 10 05:05:01 np0005479823 systemd[1]: Reached target Socket Units.
Oct 10 05:05:01 np0005479823 systemd[1]: Starting D-Bus System Message Bus...
Oct 10 05:05:01 np0005479823 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 10 05:05:01 np0005479823 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Oct 10 05:05:01 np0005479823 systemd[1]: Starting Load Kernel Module configfs...
Oct 10 05:05:01 np0005479823 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 10 05:05:01 np0005479823 systemd[1]: Finished Load Kernel Module configfs.
Oct 10 05:05:01 np0005479823 systemd-udevd[740]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 05:05:01 np0005479823 systemd[1]: Started D-Bus System Message Bus.
Oct 10 05:05:01 np0005479823 systemd[1]: Reached target Basic System.
Oct 10 05:05:01 np0005479823 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Oct 10 05:05:01 np0005479823 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Oct 10 05:05:01 np0005479823 dbus-broker-lau[753]: Ready
Oct 10 05:05:01 np0005479823 systemd[1]: Starting NTP client/server...
Oct 10 05:05:01 np0005479823 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Oct 10 05:05:02 np0005479823 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Oct 10 05:05:02 np0005479823 chronyd[785]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct 10 05:05:02 np0005479823 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Oct 10 05:05:02 np0005479823 chronyd[785]: Loaded 0 symmetric keys
Oct 10 05:05:02 np0005479823 systemd[1]: Starting Restore /run/initramfs on shutdown...
Oct 10 05:05:02 np0005479823 chronyd[785]: Using right/UTC timezone to obtain leap second data
Oct 10 05:05:02 np0005479823 chronyd[785]: Loaded seccomp filter (level 2)
Oct 10 05:05:02 np0005479823 systemd[1]: Starting IPv4 firewall with iptables...
Oct 10 05:05:02 np0005479823 systemd[1]: Started irqbalance daemon.
Oct 10 05:05:02 np0005479823 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Oct 10 05:05:02 np0005479823 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 10 05:05:02 np0005479823 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 10 05:05:02 np0005479823 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 10 05:05:02 np0005479823 systemd[1]: Reached target sshd-keygen.target.
Oct 10 05:05:02 np0005479823 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Oct 10 05:05:02 np0005479823 systemd[1]: Reached target User and Group Name Lookups.
Oct 10 05:05:02 np0005479823 systemd[1]: Starting User Login Management...
Oct 10 05:05:02 np0005479823 systemd[1]: Started NTP client/server.
Oct 10 05:05:02 np0005479823 systemd[1]: Finished Restore /run/initramfs on shutdown.
Oct 10 05:05:02 np0005479823 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Oct 10 05:05:02 np0005479823 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Oct 10 05:05:02 np0005479823 systemd-logind[796]: Watching system buttons on /dev/input/event0 (Power Button)
Oct 10 05:05:02 np0005479823 systemd-logind[796]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct 10 05:05:02 np0005479823 kernel: kvm_amd: TSC scaling supported
Oct 10 05:05:02 np0005479823 kernel: kvm_amd: Nested Virtualization enabled
Oct 10 05:05:02 np0005479823 kernel: kvm_amd: Nested Paging enabled
Oct 10 05:05:02 np0005479823 kernel: kvm_amd: LBR virtualization supported
Oct 10 05:05:02 np0005479823 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Oct 10 05:05:02 np0005479823 kernel: Console: switching to colour dummy device 80x25
Oct 10 05:05:02 np0005479823 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Oct 10 05:05:02 np0005479823 kernel: [drm] features: -context_init
Oct 10 05:05:02 np0005479823 systemd-logind[796]: New seat seat0.
Oct 10 05:05:02 np0005479823 systemd[1]: Started User Login Management.
Oct 10 05:05:02 np0005479823 kernel: [drm] number of scanouts: 1
Oct 10 05:05:02 np0005479823 kernel: [drm] number of cap sets: 0
Oct 10 05:05:02 np0005479823 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Oct 10 05:05:02 np0005479823 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Oct 10 05:05:02 np0005479823 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Oct 10 05:05:02 np0005479823 kernel: Console: switching to colour frame buffer device 128x48
Oct 10 05:05:02 np0005479823 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Oct 10 05:05:02 np0005479823 iptables.init[788]: iptables: Applying firewall rules: [  OK  ]
Oct 10 05:05:02 np0005479823 systemd[1]: Finished IPv4 firewall with iptables.
Oct 10 05:05:02 np0005479823 cloud-init[838]: Cloud-init v. 24.4-7.el9 running 'init-local' at Fri, 10 Oct 2025 09:05:02 +0000. Up 6.49 seconds.
Oct 10 05:05:03 np0005479823 systemd[1]: run-cloud\x2dinit-tmp-tmpzrdvt0mo.mount: Deactivated successfully.
Oct 10 05:05:03 np0005479823 systemd[1]: Starting Hostname Service...
Oct 10 05:05:03 np0005479823 systemd[1]: Started Hostname Service.
Oct 10 05:05:03 np0005479823 systemd-hostnamed[852]: Hostname set to <np0005479823.novalocal> (static)
Oct 10 05:05:03 np0005479823 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Oct 10 05:05:03 np0005479823 systemd[1]: Reached target Preparation for Network.
Oct 10 05:05:03 np0005479823 systemd[1]: Starting Network Manager...
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.5855] NetworkManager (version 1.54.1-1.el9) is starting... (boot:d2fa8de7-cb1e-4362-bed6-d8a2357f049b)
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.5861] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6040] manager[0x5624a3fad080]: monitoring kernel firmware directory '/lib/firmware'.
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6099] hostname: hostname: using hostnamed
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6101] hostname: static hostname changed from (none) to "np0005479823.novalocal"
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6107] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6234] manager[0x5624a3fad080]: rfkill: Wi-Fi hardware radio set enabled
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6235] manager[0x5624a3fad080]: rfkill: WWAN hardware radio set enabled
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6304] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6305] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6305] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6306] manager: Networking is enabled by state file
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6308] settings: Loaded settings plugin: keyfile (internal)
Oct 10 05:05:03 np0005479823 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6346] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6379] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6403] dhcp: init: Using DHCP client 'internal'
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6405] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6416] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6426] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6433] device (lo): Activation: starting connection 'lo' (b2f4c0ce-6660-4aa4-ac06-17229f19cc05)
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6442] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6445] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6472] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6476] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6478] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6479] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6481] device (eth0): carrier: link connected
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6483] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6487] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6491] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6494] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6494] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6496] manager: NetworkManager state is now CONNECTING
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6496] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6501] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6503] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6544] dhcp4 (eth0): state changed new lease, address=38.102.83.22
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6550] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct 10 05:05:03 np0005479823 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6565] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 05:05:03 np0005479823 systemd[1]: Started Network Manager.
Oct 10 05:05:03 np0005479823 systemd[1]: Reached target Network.
Oct 10 05:05:03 np0005479823 systemd[1]: Starting Network Manager Wait Online...
Oct 10 05:05:03 np0005479823 systemd[1]: Starting GSSAPI Proxy Daemon...
Oct 10 05:05:03 np0005479823 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6805] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6807] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6813] device (lo): Activation: successful, device activated.
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6818] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6819] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6821] manager: NetworkManager state is now CONNECTED_SITE
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6823] device (eth0): Activation: successful, device activated.
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6827] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct 10 05:05:03 np0005479823 NetworkManager[856]: <info>  [1760087103.6829] manager: startup complete
Oct 10 05:05:03 np0005479823 systemd[1]: Finished Network Manager Wait Online.
Oct 10 05:05:03 np0005479823 systemd[1]: Starting Cloud-init: Network Stage...
Oct 10 05:05:03 np0005479823 systemd[1]: Started GSSAPI Proxy Daemon.
Oct 10 05:05:03 np0005479823 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct 10 05:05:03 np0005479823 systemd[1]: Reached target NFS client services.
Oct 10 05:05:03 np0005479823 systemd[1]: Reached target Preparation for Remote File Systems.
Oct 10 05:05:03 np0005479823 systemd[1]: Reached target Remote File Systems.
Oct 10 05:05:03 np0005479823 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 10 05:05:04 np0005479823 cloud-init[919]: Cloud-init v. 24.4-7.el9 running 'init' at Fri, 10 Oct 2025 09:05:04 +0000. Up 7.61 seconds.
Oct 10 05:05:04 np0005479823 cloud-init[919]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Oct 10 05:05:04 np0005479823 cloud-init[919]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct 10 05:05:04 np0005479823 cloud-init[919]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Oct 10 05:05:04 np0005479823 cloud-init[919]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct 10 05:05:04 np0005479823 cloud-init[919]: ci-info: |  eth0  | True |         38.102.83.22         | 255.255.255.0 | global | fa:16:3e:a7:e1:7f |
Oct 10 05:05:04 np0005479823 cloud-init[919]: ci-info: |  eth0  | True | fe80::f816:3eff:fea7:e17f/64 |       .       |  link  | fa:16:3e:a7:e1:7f |
Oct 10 05:05:04 np0005479823 cloud-init[919]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Oct 10 05:05:04 np0005479823 cloud-init[919]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Oct 10 05:05:04 np0005479823 cloud-init[919]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct 10 05:05:04 np0005479823 cloud-init[919]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Oct 10 05:05:04 np0005479823 cloud-init[919]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 10 05:05:04 np0005479823 cloud-init[919]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Oct 10 05:05:04 np0005479823 cloud-init[919]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 10 05:05:04 np0005479823 cloud-init[919]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Oct 10 05:05:04 np0005479823 cloud-init[919]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Oct 10 05:05:04 np0005479823 cloud-init[919]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Oct 10 05:05:04 np0005479823 cloud-init[919]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 10 05:05:04 np0005479823 cloud-init[919]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Oct 10 05:05:04 np0005479823 cloud-init[919]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 10 05:05:04 np0005479823 cloud-init[919]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Oct 10 05:05:04 np0005479823 cloud-init[919]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 10 05:05:04 np0005479823 cloud-init[919]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Oct 10 05:05:04 np0005479823 cloud-init[919]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Oct 10 05:05:04 np0005479823 cloud-init[919]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 10 05:05:05 np0005479823 cloud-init[919]: Generating public/private rsa key pair.
Oct 10 05:05:05 np0005479823 cloud-init[919]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Oct 10 05:05:05 np0005479823 cloud-init[919]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Oct 10 05:05:05 np0005479823 cloud-init[919]: The key fingerprint is:
Oct 10 05:05:05 np0005479823 cloud-init[919]: SHA256:t/SQhZU1w5Y/d91ueZ7uPOfeowunYPDSSoyEHlJ//70 root@np0005479823.novalocal
Oct 10 05:05:05 np0005479823 cloud-init[919]: The key's randomart image is:
Oct 10 05:05:05 np0005479823 cloud-init[919]: +---[RSA 3072]----+
Oct 10 05:05:05 np0005479823 cloud-init[919]: |            .++. |
Oct 10 05:05:05 np0005479823 cloud-init[919]: |           o. +o |
Oct 10 05:05:05 np0005479823 cloud-init[919]: |   .      . .. .o|
Oct 10 05:05:05 np0005479823 cloud-init[919]: |  . o      o   .*|
Oct 10 05:05:05 np0005479823 cloud-init[919]: | . o o oS =    .=|
Oct 10 05:05:05 np0005479823 cloud-init[919]: |  o o + =o +   .+|
Oct 10 05:05:05 np0005479823 cloud-init[919]: |   . . + *.... oo|
Oct 10 05:05:05 np0005479823 cloud-init[919]: |      . + o = .++|
Oct 10 05:05:05 np0005479823 cloud-init[919]: |       .   o E**B|
Oct 10 05:05:05 np0005479823 cloud-init[919]: +----[SHA256]-----+
Oct 10 05:05:05 np0005479823 cloud-init[919]: Generating public/private ecdsa key pair.
Oct 10 05:05:05 np0005479823 cloud-init[919]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Oct 10 05:05:05 np0005479823 cloud-init[919]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Oct 10 05:05:05 np0005479823 cloud-init[919]: The key fingerprint is:
Oct 10 05:05:05 np0005479823 cloud-init[919]: SHA256:99zX2Y02Xg+OcLpeocHbNt/2VxdcdEc959JD/kka7qU root@np0005479823.novalocal
Oct 10 05:05:05 np0005479823 cloud-init[919]: The key's randomart image is:
Oct 10 05:05:05 np0005479823 cloud-init[919]: +---[ECDSA 256]---+
Oct 10 05:05:05 np0005479823 cloud-init[919]: |               o*|
Oct 10 05:05:05 np0005479823 cloud-init[919]: |               o*|
Oct 10 05:05:05 np0005479823 cloud-init[919]: |              +o+|
Oct 10 05:05:05 np0005479823 cloud-init[919]: |         .   ..*o|
Oct 10 05:05:05 np0005479823 cloud-init[919]: |        S + o +.=|
Oct 10 05:05:05 np0005479823 cloud-init[919]: |         . B = +O|
Oct 10 05:05:05 np0005479823 cloud-init[919]: |          + X O.O|
Oct 10 05:05:05 np0005479823 cloud-init[919]: |           * E *+|
Oct 10 05:05:05 np0005479823 cloud-init[919]: |         .+.. +.*|
Oct 10 05:05:05 np0005479823 cloud-init[919]: +----[SHA256]-----+
Oct 10 05:05:05 np0005479823 cloud-init[919]: Generating public/private ed25519 key pair.
Oct 10 05:05:05 np0005479823 cloud-init[919]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Oct 10 05:05:05 np0005479823 cloud-init[919]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Oct 10 05:05:05 np0005479823 cloud-init[919]: The key fingerprint is:
Oct 10 05:05:05 np0005479823 cloud-init[919]: SHA256:An2o15wB8EXt32HDzImmdKZQcdQjbwVK+IdUItPhxRE root@np0005479823.novalocal
Oct 10 05:05:05 np0005479823 cloud-init[919]: The key's randomart image is:
Oct 10 05:05:05 np0005479823 cloud-init[919]: +--[ED25519 256]--+
Oct 10 05:05:05 np0005479823 cloud-init[919]: |    ....o.+=*=E+ |
Oct 10 05:05:05 np0005479823 cloud-init[919]: |     o +  +*+++ .|
Oct 10 05:05:05 np0005479823 cloud-init[919]: |    . + oo ooO + |
Oct 10 05:05:05 np0005479823 cloud-init[919]: |     o +.oo B %  |
Oct 10 05:05:05 np0005479823 cloud-init[919]: |    . o So B = o |
Oct 10 05:05:05 np0005479823 cloud-init[919]: |     . .  o . .  |
Oct 10 05:05:05 np0005479823 cloud-init[919]: |                 |
Oct 10 05:05:05 np0005479823 cloud-init[919]: |                 |
Oct 10 05:05:05 np0005479823 cloud-init[919]: |                 |
Oct 10 05:05:05 np0005479823 cloud-init[919]: +----[SHA256]-----+
Oct 10 05:05:05 np0005479823 systemd[1]: Finished Cloud-init: Network Stage.
Oct 10 05:05:05 np0005479823 systemd[1]: Reached target Cloud-config availability.
Oct 10 05:05:05 np0005479823 systemd[1]: Reached target Network is Online.
Oct 10 05:05:05 np0005479823 systemd[1]: Starting Cloud-init: Config Stage...
Oct 10 05:05:05 np0005479823 systemd[1]: Starting Notify NFS peers of a restart...
Oct 10 05:05:05 np0005479823 systemd[1]: Starting System Logging Service...
Oct 10 05:05:05 np0005479823 sm-notify[1000]: Version 2.5.4 starting
Oct 10 05:05:05 np0005479823 systemd[1]: Starting OpenSSH server daemon...
Oct 10 05:05:05 np0005479823 systemd[1]: Starting Permit User Sessions...
Oct 10 05:05:05 np0005479823 systemd[1]: Started Notify NFS peers of a restart.
Oct 10 05:05:05 np0005479823 systemd[1]: Started OpenSSH server daemon.
Oct 10 05:05:05 np0005479823 systemd[1]: Finished Permit User Sessions.
Oct 10 05:05:05 np0005479823 systemd[1]: Started Command Scheduler.
Oct 10 05:05:05 np0005479823 systemd[1]: Started Getty on tty1.
Oct 10 05:05:05 np0005479823 systemd[1]: Started Serial Getty on ttyS0.
Oct 10 05:05:05 np0005479823 systemd[1]: Reached target Login Prompts.
Oct 10 05:05:05 np0005479823 rsyslogd[1001]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1001" x-info="https://www.rsyslog.com"] start
Oct 10 05:05:05 np0005479823 systemd[1]: Started System Logging Service.
Oct 10 05:05:05 np0005479823 rsyslogd[1001]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Oct 10 05:05:05 np0005479823 systemd[1]: Reached target Multi-User System.
Oct 10 05:05:05 np0005479823 systemd[1]: Starting Record Runlevel Change in UTMP...
Oct 10 05:05:05 np0005479823 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Oct 10 05:05:05 np0005479823 systemd[1]: Finished Record Runlevel Change in UTMP.
Oct 10 05:05:06 np0005479823 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 05:05:06 np0005479823 cloud-init[1014]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Fri, 10 Oct 2025 09:05:06 +0000. Up 9.76 seconds.
Oct 10 05:05:06 np0005479823 systemd[1]: Finished Cloud-init: Config Stage.
Oct 10 05:05:06 np0005479823 systemd[1]: Starting Cloud-init: Final Stage...
Oct 10 05:05:06 np0005479823 cloud-init[1018]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Fri, 10 Oct 2025 09:05:06 +0000. Up 10.15 seconds.
Oct 10 05:05:06 np0005479823 cloud-init[1021]: #############################################################
Oct 10 05:05:06 np0005479823 cloud-init[1023]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Oct 10 05:05:06 np0005479823 cloud-init[1026]: 256 SHA256:99zX2Y02Xg+OcLpeocHbNt/2VxdcdEc959JD/kka7qU root@np0005479823.novalocal (ECDSA)
Oct 10 05:05:06 np0005479823 cloud-init[1030]: 256 SHA256:An2o15wB8EXt32HDzImmdKZQcdQjbwVK+IdUItPhxRE root@np0005479823.novalocal (ED25519)
Oct 10 05:05:06 np0005479823 cloud-init[1033]: 3072 SHA256:t/SQhZU1w5Y/d91ueZ7uPOfeowunYPDSSoyEHlJ//70 root@np0005479823.novalocal (RSA)
Oct 10 05:05:06 np0005479823 cloud-init[1034]: -----END SSH HOST KEY FINGERPRINTS-----
Oct 10 05:05:06 np0005479823 cloud-init[1036]: #############################################################
Oct 10 05:05:06 np0005479823 cloud-init[1018]: Cloud-init v. 24.4-7.el9 finished at Fri, 10 Oct 2025 09:05:06 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.36 seconds
Oct 10 05:05:06 np0005479823 systemd[1]: Finished Cloud-init: Final Stage.
Oct 10 05:05:06 np0005479823 systemd[1]: Reached target Cloud-init target.
Oct 10 05:05:06 np0005479823 systemd[1]: Startup finished in 1.498s (kernel) + 2.394s (initrd) + 6.528s (userspace) = 10.421s.
Oct 10 05:05:08 np0005479823 chronyd[785]: Selected source 45.61.49.156 (2.centos.pool.ntp.org)
Oct 10 05:05:08 np0005479823 chronyd[785]: System clock TAI offset set to 37 seconds
Oct 10 05:05:12 np0005479823 irqbalance[790]: Cannot change IRQ 25 affinity: Operation not permitted
Oct 10 05:05:12 np0005479823 irqbalance[790]: IRQ 25 affinity is now unmanaged
Oct 10 05:05:12 np0005479823 irqbalance[790]: Cannot change IRQ 31 affinity: Operation not permitted
Oct 10 05:05:12 np0005479823 irqbalance[790]: IRQ 31 affinity is now unmanaged
Oct 10 05:05:12 np0005479823 irqbalance[790]: Cannot change IRQ 28 affinity: Operation not permitted
Oct 10 05:05:12 np0005479823 irqbalance[790]: IRQ 28 affinity is now unmanaged
Oct 10 05:05:12 np0005479823 irqbalance[790]: Cannot change IRQ 32 affinity: Operation not permitted
Oct 10 05:05:12 np0005479823 irqbalance[790]: IRQ 32 affinity is now unmanaged
Oct 10 05:05:12 np0005479823 irqbalance[790]: Cannot change IRQ 30 affinity: Operation not permitted
Oct 10 05:05:12 np0005479823 irqbalance[790]: IRQ 30 affinity is now unmanaged
Oct 10 05:05:12 np0005479823 irqbalance[790]: Cannot change IRQ 29 affinity: Operation not permitted
Oct 10 05:05:12 np0005479823 irqbalance[790]: IRQ 29 affinity is now unmanaged
Oct 10 05:05:13 np0005479823 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 10 05:05:19 np0005479823 systemd[1]: Created slice User Slice of UID 1000.
Oct 10 05:05:19 np0005479823 systemd[1]: Starting User Runtime Directory /run/user/1000...
Oct 10 05:05:19 np0005479823 systemd-logind[796]: New session 1 of user zuul.
Oct 10 05:05:19 np0005479823 systemd[1]: Finished User Runtime Directory /run/user/1000.
Oct 10 05:05:19 np0005479823 systemd[1]: Starting User Manager for UID 1000...
Oct 10 05:05:19 np0005479823 systemd[1055]: Queued start job for default target Main User Target.
Oct 10 05:05:19 np0005479823 systemd[1055]: Created slice User Application Slice.
Oct 10 05:05:19 np0005479823 systemd[1055]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 10 05:05:19 np0005479823 systemd[1055]: Started Daily Cleanup of User's Temporary Directories.
Oct 10 05:05:19 np0005479823 systemd[1055]: Reached target Paths.
Oct 10 05:05:19 np0005479823 systemd[1055]: Reached target Timers.
Oct 10 05:05:19 np0005479823 systemd[1055]: Starting D-Bus User Message Bus Socket...
Oct 10 05:05:19 np0005479823 systemd[1055]: Starting Create User's Volatile Files and Directories...
Oct 10 05:05:19 np0005479823 systemd[1055]: Finished Create User's Volatile Files and Directories.
Oct 10 05:05:19 np0005479823 systemd[1055]: Listening on D-Bus User Message Bus Socket.
Oct 10 05:05:19 np0005479823 systemd[1055]: Reached target Sockets.
Oct 10 05:05:19 np0005479823 systemd[1055]: Reached target Basic System.
Oct 10 05:05:19 np0005479823 systemd[1055]: Reached target Main User Target.
Oct 10 05:05:19 np0005479823 systemd[1055]: Startup finished in 127ms.
Oct 10 05:05:19 np0005479823 systemd[1]: Started User Manager for UID 1000.
Oct 10 05:05:19 np0005479823 systemd[1]: Started Session 1 of User zuul.
Oct 10 05:05:20 np0005479823 python3[1138]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:05:25 np0005479823 python3[1166]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:05:32 np0005479823 python3[1224]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:05:33 np0005479823 python3[1264]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Oct 10 05:05:33 np0005479823 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 10 05:05:35 np0005479823 python3[1292]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDDEBkxJ4sw2+DK3cAbafLjRenK6XkRzPrF3EgUC0Qy/9kZ0kuErGkKyCEXRNE93NnKaUfoU9ebcJtP/W0B6xem+P337Yb5eT1d5d0DPlSyJ224O/rNncfiIo6YcMhrWXlb8yWwfHogZqjmOgJoH57cdsVMt26tUmFXzrJ1qEBloCvfoEe/tx8o3aeflIhUQ0zm2bbmhRn09oGRCODyyr02YoJZm5GbMiTb7mz8xvM31PEo8DzS5ti1YMOUi76ojLKIS6hZkIk4sUuSXmOwBoYhmyGjvs8csl/rxfVJq3bV+DFnatOKlFCyjgY0Ed4oCeReEGI6h29najM/8mUzfOeBj0dyWj3N3oOwlewtF5ifTB4JPwfEN1Rx37wbEzN/2Q7MOKzeWDxP2E0trD5ey9oqWFCpRpuJURMiPr+A6h070uR8U8vUNxGtH3vAmkuN+p3w79WF1wzlCmcoC+oSdwETcoOqkD84qkNgYJpVVpboSnwBo/H/aPJuJhs/nYPhz+c= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:35 np0005479823 python3[1316]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:05:36 np0005479823 python3[1415]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 05:05:36 np0005479823 python3[1486]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760087136.1893063-254-12633506510624/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=bea29065a9ff49468ede17c902a062ce_id_rsa follow=False checksum=6477c55dd7b29e382b0ff49c34043ebcd2bcc305 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:05:37 np0005479823 python3[1609]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 05:05:37 np0005479823 python3[1680]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760087137.1955144-308-126061657745081/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=bea29065a9ff49468ede17c902a062ce_id_rsa.pub follow=False checksum=8b86d6c8317b3a249fa7c3a90607af8e51a186ef backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:05:39 np0005479823 python3[1728]: ansible-ping Invoked with data=pong
Oct 10 05:05:40 np0005479823 python3[1752]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:05:42 np0005479823 python3[1810]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Oct 10 05:05:43 np0005479823 python3[1842]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:05:44 np0005479823 python3[1866]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:05:44 np0005479823 python3[1890]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:05:44 np0005479823 python3[1914]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:05:44 np0005479823 python3[1938]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:05:45 np0005479823 python3[1962]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:05:47 np0005479823 python3[1988]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:05:47 np0005479823 python3[2066]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 05:05:48 np0005479823 python3[2139]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1760087147.2492192-34-111049401545547/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:05:48 np0005479823 python3[2187]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:49 np0005479823 python3[2211]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:49 np0005479823 python3[2235]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:49 np0005479823 python3[2259]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:50 np0005479823 python3[2283]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:50 np0005479823 python3[2307]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:50 np0005479823 python3[2331]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:50 np0005479823 python3[2355]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:51 np0005479823 python3[2379]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:51 np0005479823 python3[2403]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:51 np0005479823 python3[2427]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:52 np0005479823 python3[2451]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:52 np0005479823 python3[2475]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:52 np0005479823 python3[2499]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:52 np0005479823 python3[2523]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:53 np0005479823 python3[2547]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:53 np0005479823 python3[2571]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:53 np0005479823 python3[2595]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:54 np0005479823 python3[2619]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:54 np0005479823 python3[2643]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:54 np0005479823 python3[2667]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:54 np0005479823 python3[2691]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:55 np0005479823 python3[2715]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:55 np0005479823 python3[2739]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:55 np0005479823 python3[2763]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:56 np0005479823 python3[2787]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:05:58 np0005479823 python3[2813]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct 10 05:05:58 np0005479823 systemd[1]: Starting Time & Date Service...
Oct 10 05:05:58 np0005479823 systemd[1]: Started Time & Date Service.
Oct 10 05:05:58 np0005479823 systemd-timedated[2815]: Changed time zone to 'UTC' (UTC).
Oct 10 05:05:58 np0005479823 python3[2844]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:05:59 np0005479823 python3[2920]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 05:05:59 np0005479823 python3[2991]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1760087159.1882753-254-206617524013399/source _original_basename=tmpzodar4gc follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:06:00 np0005479823 python3[3091]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 05:06:00 np0005479823 python3[3162]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1760087160.0538177-304-208790470661808/source _original_basename=tmpc6zltfz1 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:06:01 np0005479823 python3[3264]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 05:06:02 np0005479823 python3[3337]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1760087161.3989956-384-198295954028806/source _original_basename=tmpsfb2cfi9 follow=False checksum=0a5264336eaf669ce906803fabc64043ef3757da backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:06:02 np0005479823 python3[3385]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:06:03 np0005479823 python3[3411]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:06:03 np0005479823 python3[3491]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 05:06:03 np0005479823 python3[3564]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1760087163.2109654-454-274442428240427/source _original_basename=tmpg4pk5syj follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:06:04 np0005479823 python3[3615]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-80e1-2ccb-00000000001f-1-compute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:06:05 np0005479823 python3[3643]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163efc-24cc-80e1-2ccb-000000000020-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Oct 10 05:06:06 np0005479823 python3[3671]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:06:24 np0005479823 python3[3697]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:06:28 np0005479823 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 10 05:07:24 np0005479823 systemd-logind[796]: Session 1 logged out. Waiting for processes to exit.
Oct 10 05:07:52 np0005479823 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct 10 05:07:52 np0005479823 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Oct 10 05:07:52 np0005479823 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Oct 10 05:07:52 np0005479823 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Oct 10 05:07:52 np0005479823 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Oct 10 05:07:52 np0005479823 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Oct 10 05:07:52 np0005479823 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Oct 10 05:07:52 np0005479823 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Oct 10 05:07:52 np0005479823 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Oct 10 05:07:52 np0005479823 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Oct 10 05:07:52 np0005479823 NetworkManager[856]: <info>  [1760087272.5440] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct 10 05:07:52 np0005479823 systemd-udevd[3700]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 05:07:52 np0005479823 NetworkManager[856]: <info>  [1760087272.5639] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 05:07:52 np0005479823 NetworkManager[856]: <info>  [1760087272.5667] settings: (eth1): created default wired connection 'Wired connection 1'
Oct 10 05:07:52 np0005479823 NetworkManager[856]: <info>  [1760087272.5670] device (eth1): carrier: link connected
Oct 10 05:07:52 np0005479823 NetworkManager[856]: <info>  [1760087272.5671] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct 10 05:07:52 np0005479823 NetworkManager[856]: <info>  [1760087272.5675] policy: auto-activating connection 'Wired connection 1' (9070ab9c-fab6-3aab-b68b-48035af180d0)
Oct 10 05:07:52 np0005479823 NetworkManager[856]: <info>  [1760087272.5679] device (eth1): Activation: starting connection 'Wired connection 1' (9070ab9c-fab6-3aab-b68b-48035af180d0)
Oct 10 05:07:52 np0005479823 NetworkManager[856]: <info>  [1760087272.5679] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 05:07:52 np0005479823 NetworkManager[856]: <info>  [1760087272.5681] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 05:07:52 np0005479823 NetworkManager[856]: <info>  [1760087272.5686] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 05:07:52 np0005479823 NetworkManager[856]: <info>  [1760087272.5690] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct 10 05:07:52 np0005479823 systemd[1055]: Starting Mark boot as successful...
Oct 10 05:07:52 np0005479823 systemd[1055]: Finished Mark boot as successful.
Oct 10 05:07:53 np0005479823 systemd-logind[796]: New session 3 of user zuul.
Oct 10 05:07:53 np0005479823 systemd[1]: Started Session 3 of User zuul.
Oct 10 05:07:53 np0005479823 python3[3732]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-dbf0-3472-0000000001f6-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:08:03 np0005479823 python3[3812]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 05:08:04 np0005479823 python3[3885]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760087283.6884868-206-60086575068802/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=f779573b6ceb38b51d12ffbc9edceedeba50f1e9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:08:05 np0005479823 python3[3935]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 05:08:05 np0005479823 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct 10 05:08:05 np0005479823 systemd[1]: Stopped Network Manager Wait Online.
Oct 10 05:08:05 np0005479823 systemd[1]: Stopping Network Manager Wait Online...
Oct 10 05:08:05 np0005479823 systemd[1]: Stopping Network Manager...
Oct 10 05:08:05 np0005479823 NetworkManager[856]: <info>  [1760087285.0358] caught SIGTERM, shutting down normally.
Oct 10 05:08:05 np0005479823 NetworkManager[856]: <info>  [1760087285.0370] dhcp4 (eth0): canceled DHCP transaction
Oct 10 05:08:05 np0005479823 NetworkManager[856]: <info>  [1760087285.0370] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 10 05:08:05 np0005479823 NetworkManager[856]: <info>  [1760087285.0370] dhcp4 (eth0): state changed no lease
Oct 10 05:08:05 np0005479823 NetworkManager[856]: <info>  [1760087285.0373] manager: NetworkManager state is now CONNECTING
Oct 10 05:08:05 np0005479823 NetworkManager[856]: <info>  [1760087285.0519] dhcp4 (eth1): canceled DHCP transaction
Oct 10 05:08:05 np0005479823 NetworkManager[856]: <info>  [1760087285.0519] dhcp4 (eth1): state changed no lease
Oct 10 05:08:05 np0005479823 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 10 05:08:05 np0005479823 NetworkManager[856]: <info>  [1760087285.0632] exiting (success)
Oct 10 05:08:05 np0005479823 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 10 05:08:05 np0005479823 systemd[1]: NetworkManager.service: Deactivated successfully.
Oct 10 05:08:05 np0005479823 systemd[1]: Stopped Network Manager.
Oct 10 05:08:05 np0005479823 systemd[1]: NetworkManager.service: Consumed 1.241s CPU time, 10.2M memory peak.
Oct 10 05:08:05 np0005479823 systemd[1]: Starting Network Manager...
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.1220] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:d2fa8de7-cb1e-4362-bed6-d8a2357f049b)
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.1222] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.1279] manager[0x561a7f298070]: monitoring kernel firmware directory '/lib/firmware'.
Oct 10 05:08:05 np0005479823 systemd[1]: Starting Hostname Service...
Oct 10 05:08:05 np0005479823 systemd[1]: Started Hostname Service.
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2356] hostname: hostname: using hostnamed
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2357] hostname: static hostname changed from (none) to "np0005479823.novalocal"
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2365] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2372] manager[0x561a7f298070]: rfkill: Wi-Fi hardware radio set enabled
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2373] manager[0x561a7f298070]: rfkill: WWAN hardware radio set enabled
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2404] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2405] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2405] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2406] manager: Networking is enabled by state file
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2408] settings: Loaded settings plugin: keyfile (internal)
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2412] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2438] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2446] dhcp: init: Using DHCP client 'internal'
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2449] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2454] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2460] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2469] device (lo): Activation: starting connection 'lo' (b2f4c0ce-6660-4aa4-ac06-17229f19cc05)
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2477] device (eth0): carrier: link connected
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2482] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2489] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2489] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2496] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2502] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2509] device (eth1): carrier: link connected
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2513] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2519] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (9070ab9c-fab6-3aab-b68b-48035af180d0) (indicated)
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2519] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2525] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2532] device (eth1): Activation: starting connection 'Wired connection 1' (9070ab9c-fab6-3aab-b68b-48035af180d0)
Oct 10 05:08:05 np0005479823 systemd[1]: Started Network Manager.
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2540] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2546] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2550] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2553] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2557] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2563] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2566] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2571] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2575] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2582] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2586] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2595] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2598] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2612] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2617] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2622] device (lo): Activation: successful, device activated.
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2637] dhcp4 (eth0): state changed new lease, address=38.102.83.22
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2643] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2720] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2744] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2745] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2750] manager: NetworkManager state is now CONNECTED_SITE
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2754] device (eth0): Activation: successful, device activated.
Oct 10 05:08:05 np0005479823 NetworkManager[3947]: <info>  [1760087285.2760] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct 10 05:08:05 np0005479823 systemd[1]: Starting Network Manager Wait Online...
Oct 10 05:08:05 np0005479823 python3[4019]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-dbf0-3472-0000000000d3-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:08:15 np0005479823 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 10 05:08:35 np0005479823 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 10 05:08:50 np0005479823 NetworkManager[3947]: <info>  [1760087330.3890] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct 10 05:08:50 np0005479823 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 10 05:08:50 np0005479823 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 10 05:08:50 np0005479823 NetworkManager[3947]: <info>  [1760087330.4203] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct 10 05:08:50 np0005479823 NetworkManager[3947]: <info>  [1760087330.4205] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct 10 05:08:50 np0005479823 NetworkManager[3947]: <info>  [1760087330.4214] device (eth1): Activation: successful, device activated.
Oct 10 05:08:50 np0005479823 NetworkManager[3947]: <info>  [1760087330.4221] manager: startup complete
Oct 10 05:08:50 np0005479823 NetworkManager[3947]: <info>  [1760087330.4223] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Oct 10 05:08:50 np0005479823 NetworkManager[3947]: <warn>  [1760087330.4228] device (eth1): Activation: failed for connection 'Wired connection 1'
Oct 10 05:08:50 np0005479823 NetworkManager[3947]: <info>  [1760087330.4236] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Oct 10 05:08:50 np0005479823 systemd[1]: Finished Network Manager Wait Online.
Oct 10 05:08:50 np0005479823 NetworkManager[3947]: <info>  [1760087330.4358] dhcp4 (eth1): canceled DHCP transaction
Oct 10 05:08:50 np0005479823 NetworkManager[3947]: <info>  [1760087330.4359] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct 10 05:08:50 np0005479823 NetworkManager[3947]: <info>  [1760087330.4359] dhcp4 (eth1): state changed no lease
Oct 10 05:08:50 np0005479823 NetworkManager[3947]: <info>  [1760087330.4371] policy: auto-activating connection 'ci-private-network' (97070329-66da-5289-8aaa-712e43fb35a8)
Oct 10 05:08:50 np0005479823 NetworkManager[3947]: <info>  [1760087330.4375] device (eth1): Activation: starting connection 'ci-private-network' (97070329-66da-5289-8aaa-712e43fb35a8)
Oct 10 05:08:50 np0005479823 NetworkManager[3947]: <info>  [1760087330.4376] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 05:08:50 np0005479823 NetworkManager[3947]: <info>  [1760087330.4379] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 05:08:50 np0005479823 NetworkManager[3947]: <info>  [1760087330.4384] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 05:08:50 np0005479823 NetworkManager[3947]: <info>  [1760087330.4391] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 05:08:50 np0005479823 NetworkManager[3947]: <info>  [1760087330.4433] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 05:08:50 np0005479823 NetworkManager[3947]: <info>  [1760087330.4435] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 05:08:50 np0005479823 NetworkManager[3947]: <info>  [1760087330.4449] device (eth1): Activation: successful, device activated.
Oct 10 05:09:00 np0005479823 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 10 05:09:05 np0005479823 systemd[1]: session-3.scope: Deactivated successfully.
Oct 10 05:09:05 np0005479823 systemd[1]: session-3.scope: Consumed 1.645s CPU time.
Oct 10 05:09:05 np0005479823 systemd-logind[796]: Session 3 logged out. Waiting for processes to exit.
Oct 10 05:09:05 np0005479823 systemd-logind[796]: Removed session 3.
Oct 10 05:09:19 np0005479823 systemd-logind[796]: New session 4 of user zuul.
Oct 10 05:09:19 np0005479823 systemd[1]: Started Session 4 of User zuul.
Oct 10 05:09:20 np0005479823 python3[4128]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 05:09:20 np0005479823 python3[4201]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760087359.9086897-373-209249118085449/source _original_basename=tmp2_e38k7a follow=False checksum=0edcb8668707f95c4678608a04fc39cdafb654ec backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:09:23 np0005479823 systemd[1]: session-4.scope: Deactivated successfully.
Oct 10 05:09:23 np0005479823 systemd-logind[796]: Session 4 logged out. Waiting for processes to exit.
Oct 10 05:09:23 np0005479823 systemd-logind[796]: Removed session 4.
Oct 10 05:10:59 np0005479823 systemd[1055]: Created slice User Background Tasks Slice.
Oct 10 05:10:59 np0005479823 systemd[1055]: Starting Cleanup of User's Temporary Files and Directories...
Oct 10 05:10:59 np0005479823 systemd[1055]: Finished Cleanup of User's Temporary Files and Directories.
Oct 10 05:15:52 np0005479823 systemd-logind[796]: New session 5 of user zuul.
Oct 10 05:15:52 np0005479823 systemd[1]: Started Session 5 of User zuul.
Oct 10 05:15:52 np0005479823 python3[4259]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163efc-24cc-305a-504c-000000001cfe-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:15:53 np0005479823 python3[4287]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:15:53 np0005479823 python3[4314]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:15:53 np0005479823 python3[4340]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:15:54 np0005479823 python3[4366]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:15:54 np0005479823 python3[4392]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/systemd/system.conf regexp=^#DefaultIOAccounting=no line=DefaultIOAccounting=yes state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:15:54 np0005479823 python3[4392]: ansible-ansible.builtin.lineinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Oct 10 05:15:55 np0005479823 python3[4418]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 10 05:15:55 np0005479823 systemd[1]: Reloading.
Oct 10 05:15:55 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:15:57 np0005479823 python3[4475]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Oct 10 05:15:57 np0005479823 python3[4501]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:15:57 np0005479823 python3[4529]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:15:58 np0005479823 python3[4557]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:15:58 np0005479823 python3[4585]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:15:59 np0005479823 python3[4613]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163efc-24cc-305a-504c-000000001d04-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:15:59 np0005479823 python3[4643]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:16:02 np0005479823 systemd[1]: session-5.scope: Deactivated successfully.
Oct 10 05:16:02 np0005479823 systemd[1]: session-5.scope: Consumed 3.552s CPU time.
Oct 10 05:16:02 np0005479823 systemd-logind[796]: Session 5 logged out. Waiting for processes to exit.
Oct 10 05:16:02 np0005479823 systemd-logind[796]: Removed session 5.
Oct 10 05:16:04 np0005479823 systemd-logind[796]: New session 6 of user zuul.
Oct 10 05:16:04 np0005479823 systemd[1]: Started Session 6 of User zuul.
Oct 10 05:16:04 np0005479823 python3[4677]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 10 05:16:37 np0005479823 kernel: SELinux:  Converting 363 SID table entries...
Oct 10 05:16:37 np0005479823 kernel: SELinux:  policy capability network_peer_controls=1
Oct 10 05:16:37 np0005479823 kernel: SELinux:  policy capability open_perms=1
Oct 10 05:16:37 np0005479823 kernel: SELinux:  policy capability extended_socket_class=1
Oct 10 05:16:37 np0005479823 kernel: SELinux:  policy capability always_check_network=0
Oct 10 05:16:37 np0005479823 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 10 05:16:37 np0005479823 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 10 05:16:37 np0005479823 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 10 05:16:47 np0005479823 kernel: SELinux:  Converting 363 SID table entries...
Oct 10 05:16:47 np0005479823 kernel: SELinux:  policy capability network_peer_controls=1
Oct 10 05:16:47 np0005479823 kernel: SELinux:  policy capability open_perms=1
Oct 10 05:16:47 np0005479823 kernel: SELinux:  policy capability extended_socket_class=1
Oct 10 05:16:47 np0005479823 kernel: SELinux:  policy capability always_check_network=0
Oct 10 05:16:47 np0005479823 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 10 05:16:47 np0005479823 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 10 05:16:47 np0005479823 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 10 05:16:58 np0005479823 kernel: SELinux:  Converting 363 SID table entries...
Oct 10 05:16:58 np0005479823 kernel: SELinux:  policy capability network_peer_controls=1
Oct 10 05:16:58 np0005479823 kernel: SELinux:  policy capability open_perms=1
Oct 10 05:16:58 np0005479823 kernel: SELinux:  policy capability extended_socket_class=1
Oct 10 05:16:58 np0005479823 kernel: SELinux:  policy capability always_check_network=0
Oct 10 05:16:58 np0005479823 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 10 05:16:58 np0005479823 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 10 05:16:58 np0005479823 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 10 05:17:00 np0005479823 setsebool[4746]: The virt_use_nfs policy boolean was changed to 1 by root
Oct 10 05:17:00 np0005479823 setsebool[4746]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Oct 10 05:17:13 np0005479823 kernel: SELinux:  Converting 366 SID table entries...
Oct 10 05:17:13 np0005479823 kernel: SELinux:  policy capability network_peer_controls=1
Oct 10 05:17:13 np0005479823 kernel: SELinux:  policy capability open_perms=1
Oct 10 05:17:13 np0005479823 kernel: SELinux:  policy capability extended_socket_class=1
Oct 10 05:17:13 np0005479823 kernel: SELinux:  policy capability always_check_network=0
Oct 10 05:17:13 np0005479823 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 10 05:17:13 np0005479823 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 10 05:17:13 np0005479823 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 10 05:17:34 np0005479823 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct 10 05:17:35 np0005479823 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 10 05:17:35 np0005479823 systemd[1]: Starting man-db-cache-update.service...
Oct 10 05:17:35 np0005479823 systemd[1]: Reloading.
Oct 10 05:17:35 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:17:35 np0005479823 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 10 05:17:39 np0005479823 systemd[1]: Starting PackageKit Daemon...
Oct 10 05:17:39 np0005479823 systemd[1]: Starting Authorization Manager...
Oct 10 05:17:39 np0005479823 polkitd[7343]: Started polkitd version 0.117
Oct 10 05:17:39 np0005479823 systemd[1]: Started Authorization Manager.
Oct 10 05:17:39 np0005479823 systemd[1]: Started PackageKit Daemon.
Oct 10 05:17:41 np0005479823 python3[8205]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163efc-24cc-c8da-0a8f-00000000000c-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:17:42 np0005479823 kernel: evm: overlay not supported
Oct 10 05:17:42 np0005479823 systemd[1055]: Starting D-Bus User Message Bus...
Oct 10 05:17:42 np0005479823 dbus-broker-launch[9120]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Oct 10 05:17:42 np0005479823 dbus-broker-launch[9120]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Oct 10 05:17:42 np0005479823 systemd[1055]: Started D-Bus User Message Bus.
Oct 10 05:17:42 np0005479823 dbus-broker-lau[9120]: Ready
Oct 10 05:17:42 np0005479823 systemd[1055]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct 10 05:17:42 np0005479823 systemd[1055]: Created slice Slice /user.
Oct 10 05:17:42 np0005479823 systemd[1055]: podman-9015.scope: unit configures an IP firewall, but not running as root.
Oct 10 05:17:42 np0005479823 systemd[1055]: (This warning is only shown for the first unit using IP firewalling.)
Oct 10 05:17:42 np0005479823 systemd[1055]: Started podman-9015.scope.
Oct 10 05:17:42 np0005479823 systemd[1055]: Started podman-pause-c7e5e74d.scope.
Oct 10 05:17:43 np0005479823 systemd-logind[796]: Session 6 logged out. Waiting for processes to exit.
Oct 10 05:17:43 np0005479823 systemd[1]: session-6.scope: Deactivated successfully.
Oct 10 05:17:43 np0005479823 systemd[1]: session-6.scope: Consumed 1min 5.623s CPU time.
Oct 10 05:17:43 np0005479823 systemd-logind[796]: Removed session 6.
Oct 10 05:18:01 np0005479823 systemd-logind[796]: New session 7 of user zuul.
Oct 10 05:18:01 np0005479823 systemd[1]: Started Session 7 of User zuul.
Oct 10 05:18:01 np0005479823 python3[17408]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLKa/9QXUogxywf992nox1ioEGXyzZloryP7qu5KhbNyvfDQXbxckfHpSRrx2tURERGS47wcXt32qRf5GMN12x0= zuul@np0005479820.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:18:01 np0005479823 python3[17609]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLKa/9QXUogxywf992nox1ioEGXyzZloryP7qu5KhbNyvfDQXbxckfHpSRrx2tURERGS47wcXt32qRf5GMN12x0= zuul@np0005479820.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:18:02 np0005479823 python3[17923]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005479823.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Oct 10 05:18:03 np0005479823 python3[18176]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLKa/9QXUogxywf992nox1ioEGXyzZloryP7qu5KhbNyvfDQXbxckfHpSRrx2tURERGS47wcXt32qRf5GMN12x0= zuul@np0005479820.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 10 05:18:03 np0005479823 python3[18452]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 05:18:04 np0005479823 python3[18716]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1760087883.4802418-153-276830333064873/source _original_basename=tmpqu6aj1g9 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:18:05 np0005479823 python3[19066]: ansible-ansible.builtin.hostname Invoked with name=compute-2 use=systemd
Oct 10 05:18:05 np0005479823 systemd[1]: Starting Hostname Service...
Oct 10 05:18:05 np0005479823 systemd[1]: Started Hostname Service.
Oct 10 05:18:05 np0005479823 systemd-hostnamed[19168]: Changed pretty hostname to 'compute-2'
Oct 10 05:18:05 np0005479823 systemd-hostnamed[19168]: Hostname set to <compute-2> (static)
Oct 10 05:18:05 np0005479823 NetworkManager[3947]: <info>  [1760087885.2535] hostname: static hostname changed from "np0005479823.novalocal" to "compute-2"
Oct 10 05:18:05 np0005479823 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 10 05:18:05 np0005479823 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 10 05:18:05 np0005479823 systemd[1]: session-7.scope: Deactivated successfully.
Oct 10 05:18:05 np0005479823 systemd[1]: session-7.scope: Consumed 2.412s CPU time.
Oct 10 05:18:05 np0005479823 systemd-logind[796]: Session 7 logged out. Waiting for processes to exit.
Oct 10 05:18:05 np0005479823 systemd-logind[796]: Removed session 7.
Oct 10 05:18:12 np0005479823 irqbalance[790]: Cannot change IRQ 27 affinity: Operation not permitted
Oct 10 05:18:12 np0005479823 irqbalance[790]: IRQ 27 affinity is now unmanaged
Oct 10 05:18:15 np0005479823 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 10 05:18:34 np0005479823 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 10 05:18:34 np0005479823 systemd[1]: Finished man-db-cache-update.service.
Oct 10 05:18:34 np0005479823 systemd[1]: man-db-cache-update.service: Consumed 55.078s CPU time.
Oct 10 05:18:34 np0005479823 systemd[1]: run-r8859adfa7e2241e3b266999aa66918e0.service: Deactivated successfully.
Oct 10 05:18:35 np0005479823 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 10 05:19:59 np0005479823 systemd[1]: Starting Cleanup of Temporary Directories...
Oct 10 05:19:59 np0005479823 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Oct 10 05:19:59 np0005479823 systemd[1]: Finished Cleanup of Temporary Directories.
Oct 10 05:19:59 np0005479823 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Oct 10 05:21:41 np0005479823 systemd-logind[796]: New session 8 of user zuul.
Oct 10 05:21:41 np0005479823 systemd[1]: Started Session 8 of User zuul.
Oct 10 05:21:41 np0005479823 python3[26605]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:21:43 np0005479823 python3[26721]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 05:21:44 np0005479823 python3[26794]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1760088103.3310978-30672-223803033374013/source mode=0755 _original_basename=delorean.repo follow=False checksum=c02c26d38f431b15f6463fc53c3d93ed5138ff07 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:21:44 np0005479823 python3[26820]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 05:21:44 np0005479823 python3[26893]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1760088103.3310978-30672-223803033374013/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:21:44 np0005479823 python3[26919]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 05:21:45 np0005479823 python3[26992]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1760088103.3310978-30672-223803033374013/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:21:45 np0005479823 python3[27018]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 05:21:45 np0005479823 python3[27091]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1760088103.3310978-30672-223803033374013/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:21:46 np0005479823 python3[27117]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 05:21:46 np0005479823 python3[27190]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1760088103.3310978-30672-223803033374013/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:21:46 np0005479823 python3[27216]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 05:21:47 np0005479823 python3[27289]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1760088103.3310978-30672-223803033374013/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:21:47 np0005479823 python3[27315]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 05:21:47 np0005479823 python3[27388]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1760088103.3310978-30672-223803033374013/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=75ca8f9fe9a538824fd094f239c30e8ce8652e8a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:21:59 np0005479823 python3[27436]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:22:44 np0005479823 systemd[1]: packagekit.service: Deactivated successfully.
Oct 10 05:26:59 np0005479823 systemd[1]: session-8.scope: Deactivated successfully.
Oct 10 05:26:59 np0005479823 systemd[1]: session-8.scope: Consumed 4.898s CPU time.
Oct 10 05:26:59 np0005479823 systemd-logind[796]: Session 8 logged out. Waiting for processes to exit.
Oct 10 05:26:59 np0005479823 systemd-logind[796]: Removed session 8.
Oct 10 05:33:17 np0005479823 systemd-logind[796]: New session 9 of user zuul.
Oct 10 05:33:17 np0005479823 systemd[1]: Started Session 9 of User zuul.
Oct 10 05:33:18 np0005479823 python3.9[27598]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:33:20 np0005479823 python3.9[27779]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:33:28 np0005479823 systemd[1]: session-9.scope: Deactivated successfully.
Oct 10 05:33:28 np0005479823 systemd[1]: session-9.scope: Consumed 8.321s CPU time.
Oct 10 05:33:28 np0005479823 systemd-logind[796]: Session 9 logged out. Waiting for processes to exit.
Oct 10 05:33:28 np0005479823 systemd-logind[796]: Removed session 9.
Oct 10 05:33:43 np0005479823 systemd-logind[796]: New session 10 of user zuul.
Oct 10 05:33:43 np0005479823 systemd[1]: Started Session 10 of User zuul.
Oct 10 05:33:44 np0005479823 python3.9[27989]: ansible-ansible.legacy.ping Invoked with data=pong
Oct 10 05:33:45 np0005479823 python3.9[28163]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:33:46 np0005479823 python3.9[28315]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:33:47 np0005479823 python3.9[28468]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:33:48 np0005479823 python3.9[28620]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:33:49 np0005479823 python3.9[28772]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:33:50 np0005479823 python3.9[28895]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1760088829.1501343-179-244385191920953/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:33:51 np0005479823 python3.9[29047]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:33:52 np0005479823 irqbalance[790]: Cannot change IRQ 26 affinity: Operation not permitted
Oct 10 05:33:52 np0005479823 irqbalance[790]: IRQ 26 affinity is now unmanaged
Oct 10 05:33:52 np0005479823 python3.9[29203]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:33:53 np0005479823 python3.9[29353]: ansible-ansible.builtin.service_facts Invoked
Oct 10 05:34:00 np0005479823 python3.9[29608]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:34:01 np0005479823 python3.9[29758]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:34:02 np0005479823 python3.9[29912]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:34:03 np0005479823 python3.9[30070]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 05:34:04 np0005479823 python3.9[30154]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 05:34:48 np0005479823 systemd[1]: Reloading.
Oct 10 05:34:48 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:34:48 np0005479823 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Oct 10 05:34:49 np0005479823 systemd[1]: Reloading.
Oct 10 05:34:49 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:34:49 np0005479823 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Oct 10 05:34:49 np0005479823 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Oct 10 05:34:49 np0005479823 systemd[1]: Reloading.
Oct 10 05:34:49 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:34:49 np0005479823 systemd[1]: Listening on LVM2 poll daemon socket.
Oct 10 05:34:49 np0005479823 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Oct 10 05:34:49 np0005479823 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Oct 10 05:34:49 np0005479823 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Oct 10 05:35:55 np0005479823 kernel: SELinux:  Converting 2713 SID table entries...
Oct 10 05:35:55 np0005479823 kernel: SELinux:  policy capability network_peer_controls=1
Oct 10 05:35:55 np0005479823 kernel: SELinux:  policy capability open_perms=1
Oct 10 05:35:55 np0005479823 kernel: SELinux:  policy capability extended_socket_class=1
Oct 10 05:35:55 np0005479823 kernel: SELinux:  policy capability always_check_network=0
Oct 10 05:35:55 np0005479823 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 10 05:35:55 np0005479823 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 10 05:35:55 np0005479823 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 10 05:35:55 np0005479823 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Oct 10 05:35:55 np0005479823 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 10 05:35:56 np0005479823 systemd[1]: Starting man-db-cache-update.service...
Oct 10 05:35:56 np0005479823 systemd[1]: Reloading.
Oct 10 05:35:56 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:35:56 np0005479823 systemd[1]: Starting dnf makecache...
Oct 10 05:35:56 np0005479823 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 10 05:35:56 np0005479823 dnf[30762]: Failed determining last makecache time.
Oct 10 05:35:56 np0005479823 systemd[1]: Starting PackageKit Daemon...
Oct 10 05:35:56 np0005479823 dnf[30762]: delorean-openstack-barbican-42b4c41831408a8e323 103 kB/s | 3.0 kB     00:00
Oct 10 05:35:56 np0005479823 dnf[30762]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 173 kB/s | 3.0 kB     00:00
Oct 10 05:35:56 np0005479823 dnf[30762]: delorean-openstack-cinder-1c00d6490d88e436f26ef 165 kB/s | 3.0 kB     00:00
Oct 10 05:35:56 np0005479823 systemd[1]: Started PackageKit Daemon.
Oct 10 05:35:56 np0005479823 dnf[30762]: delorean-python-stevedore-c4acc5639fd2329372142 177 kB/s | 3.0 kB     00:00
Oct 10 05:35:56 np0005479823 dnf[30762]: delorean-python-cloudkitty-tests-tempest-3961dc 174 kB/s | 3.0 kB     00:00
Oct 10 05:35:56 np0005479823 dnf[30762]: delorean-diskimage-builder-43381184423c185801b5 103 kB/s | 3.0 kB     00:00
Oct 10 05:35:56 np0005479823 dnf[30762]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 157 kB/s | 3.0 kB     00:00
Oct 10 05:35:56 np0005479823 dnf[30762]: delorean-python-designate-tests-tempest-347fdbc 163 kB/s | 3.0 kB     00:00
Oct 10 05:35:56 np0005479823 dnf[30762]: delorean-openstack-glance-1fd12c29b339f30fe823e 172 kB/s | 3.0 kB     00:00
Oct 10 05:35:56 np0005479823 dnf[30762]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 174 kB/s | 3.0 kB     00:00
Oct 10 05:35:56 np0005479823 dnf[30762]: delorean-openstack-manila-3c01b7181572c95dac462 182 kB/s | 3.0 kB     00:00
Oct 10 05:35:56 np0005479823 dnf[30762]: delorean-python-vmware-nsxlib-458234972d1428ac9 170 kB/s | 3.0 kB     00:00
Oct 10 05:35:56 np0005479823 dnf[30762]: delorean-openstack-octavia-ba397f07a7331190208c 184 kB/s | 3.0 kB     00:00
Oct 10 05:35:56 np0005479823 dnf[30762]: delorean-openstack-watcher-c014f81a8647287f6dcc 176 kB/s | 3.0 kB     00:00
Oct 10 05:35:56 np0005479823 dnf[30762]: delorean-edpm-image-builder-55ba53cf215b14ed95b 139 kB/s | 3.0 kB     00:00
Oct 10 05:35:56 np0005479823 dnf[30762]: delorean-puppet-ceph-b0c245ccde541a63fde0564366 158 kB/s | 3.0 kB     00:00
Oct 10 05:35:56 np0005479823 dnf[30762]: delorean-openstack-swift-dc98a8463506ac520c469a 127 kB/s | 3.0 kB     00:00
Oct 10 05:35:56 np0005479823 dnf[30762]: delorean-python-tempestconf-8515371b7cceebd4282 117 kB/s | 3.0 kB     00:00
Oct 10 05:35:56 np0005479823 dnf[30762]: delorean-openstack-heat-ui-013accbfd179753bc3f0 128 kB/s | 3.0 kB     00:00
Oct 10 05:35:57 np0005479823 dnf[30762]: CentOS Stream 9 - BaseOS                         24 kB/s | 6.7 kB     00:00
Oct 10 05:35:57 np0005479823 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 10 05:35:57 np0005479823 systemd[1]: Finished man-db-cache-update.service.
Oct 10 05:35:57 np0005479823 systemd[1]: man-db-cache-update.service: Consumed 1.418s CPU time.
Oct 10 05:35:57 np0005479823 systemd[1]: run-r7e7070006be94f99af409294bec20270.service: Deactivated successfully.
Oct 10 05:35:57 np0005479823 dnf[30762]: CentOS Stream 9 - AppStream                      44 kB/s | 6.8 kB     00:00
Oct 10 05:35:57 np0005479823 dnf[30762]: CentOS Stream 9 - CRB                            64 kB/s | 6.6 kB     00:00
Oct 10 05:35:57 np0005479823 dnf[30762]: CentOS Stream 9 - Extras packages                74 kB/s | 8.0 kB     00:00
Oct 10 05:35:57 np0005479823 dnf[30762]: dlrn-antelope-testing                           168 kB/s | 3.0 kB     00:00
Oct 10 05:35:57 np0005479823 dnf[30762]: dlrn-antelope-build-deps                        181 kB/s | 3.0 kB     00:00
Oct 10 05:35:57 np0005479823 dnf[30762]: centos9-rabbitmq                                 51 kB/s | 3.0 kB     00:00
Oct 10 05:35:57 np0005479823 dnf[30762]: centos9-storage                                  77 kB/s | 3.0 kB     00:00
Oct 10 05:35:58 np0005479823 dnf[30762]: centos9-opstools                                 78 kB/s | 3.0 kB     00:00
Oct 10 05:35:58 np0005479823 dnf[30762]: NFV SIG OpenvSwitch                             106 kB/s | 3.0 kB     00:00
Oct 10 05:35:58 np0005479823 dnf[30762]: repo-setup-centos-appstream                     210 kB/s | 4.4 kB     00:00
Oct 10 05:35:58 np0005479823 dnf[30762]: repo-setup-centos-baseos                        149 kB/s | 3.9 kB     00:00
Oct 10 05:35:58 np0005479823 dnf[30762]: repo-setup-centos-highavailability               38 kB/s | 3.9 kB     00:00
Oct 10 05:35:58 np0005479823 dnf[30762]: repo-setup-centos-powertools                    157 kB/s | 4.3 kB     00:00
Oct 10 05:35:58 np0005479823 dnf[30762]: Extra Packages for Enterprise Linux 9 - x86_64  106 kB/s |  25 kB     00:00
Oct 10 05:35:59 np0005479823 dnf[30762]: Metadata cache created.
Oct 10 05:35:59 np0005479823 systemd[1]: dnf-makecache.service: Deactivated successfully.
Oct 10 05:35:59 np0005479823 systemd[1]: Finished dnf makecache.
Oct 10 05:35:59 np0005479823 systemd[1]: dnf-makecache.service: Consumed 1.945s CPU time.
Oct 10 05:36:01 np0005479823 python3.9[31705]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:36:03 np0005479823 python3.9[31986]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Oct 10 05:36:04 np0005479823 python3.9[32138]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Oct 10 05:36:08 np0005479823 python3.9[32291]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:36:09 np0005479823 python3.9[32444]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Oct 10 05:36:10 np0005479823 python3.9[32596]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:36:16 np0005479823 python3.9[32748]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:36:17 np0005479823 python3.9[32871]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760088970.9191601-642-223959905449872/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=588de6fcfc4f8f2f1febb9ce163ed2886e4b0ed4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:36:19 np0005479823 python3.9[33023]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Oct 10 05:36:20 np0005479823 python3.9[33176]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 10 05:36:21 np0005479823 python3.9[33334]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 10 05:36:21 np0005479823 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 05:36:22 np0005479823 python3.9[33495]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Oct 10 05:36:23 np0005479823 python3.9[33648]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 10 05:36:24 np0005479823 python3.9[33806]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Oct 10 05:36:25 np0005479823 python3.9[33958]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 05:36:28 np0005479823 python3.9[34111]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:36:28 np0005479823 python3.9[34263]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:36:29 np0005479823 python3.9[34386]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760088988.3519216-927-115084464521278/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:36:30 np0005479823 python3.9[34538]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 05:36:30 np0005479823 systemd[1]: Starting Load Kernel Modules...
Oct 10 05:36:30 np0005479823 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Oct 10 05:36:30 np0005479823 kernel: Bridge firewalling registered
Oct 10 05:36:30 np0005479823 systemd-modules-load[34542]: Inserted module 'br_netfilter'
Oct 10 05:36:30 np0005479823 systemd[1]: Finished Load Kernel Modules.
Oct 10 05:36:31 np0005479823 python3.9[34697]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:36:32 np0005479823 python3.9[34820]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760088991.0718472-996-21999562878414/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:36:33 np0005479823 python3.9[34972]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 05:36:36 np0005479823 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Oct 10 05:36:36 np0005479823 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Oct 10 05:36:36 np0005479823 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 10 05:36:36 np0005479823 systemd[1]: Starting man-db-cache-update.service...
Oct 10 05:36:36 np0005479823 systemd[1]: Reloading.
Oct 10 05:36:37 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:36:37 np0005479823 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 10 05:36:38 np0005479823 python3.9[36648]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:36:39 np0005479823 python3.9[37588]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Oct 10 05:36:40 np0005479823 python3.9[38424]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:36:41 np0005479823 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 10 05:36:41 np0005479823 systemd[1]: Finished man-db-cache-update.service.
Oct 10 05:36:41 np0005479823 systemd[1]: man-db-cache-update.service: Consumed 5.230s CPU time.
Oct 10 05:36:41 np0005479823 systemd[1]: run-r05c4a1e4465e47d28f03753e58d0450b.service: Deactivated successfully.
Oct 10 05:36:41 np0005479823 python3.9[39145]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:36:41 np0005479823 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct 10 05:36:41 np0005479823 systemd[1]: Started Dynamic System Tuning Daemon.
Oct 10 05:36:43 np0005479823 python3.9[39518]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:36:43 np0005479823 systemd[1]: Stopping Dynamic System Tuning Daemon...
Oct 10 05:36:43 np0005479823 systemd[1]: tuned.service: Deactivated successfully.
Oct 10 05:36:43 np0005479823 systemd[1]: Stopped Dynamic System Tuning Daemon.
Oct 10 05:36:43 np0005479823 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct 10 05:36:43 np0005479823 systemd[1]: Started Dynamic System Tuning Daemon.
Oct 10 05:36:44 np0005479823 python3.9[39680]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Oct 10 05:36:48 np0005479823 python3.9[39832]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:36:48 np0005479823 systemd[1]: Reloading.
Oct 10 05:36:48 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:36:49 np0005479823 python3.9[40022]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:36:49 np0005479823 systemd[1]: Reloading.
Oct 10 05:36:49 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:36:50 np0005479823 python3.9[40211]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:36:51 np0005479823 python3.9[40364]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:36:51 np0005479823 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Oct 10 05:36:52 np0005479823 python3.9[40517]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:36:54 np0005479823 python3.9[40679]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:36:55 np0005479823 python3.9[40832]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 05:36:55 np0005479823 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct 10 05:36:55 np0005479823 systemd[1]: Stopped Apply Kernel Variables.
Oct 10 05:36:55 np0005479823 systemd[1]: Stopping Apply Kernel Variables...
Oct 10 05:36:55 np0005479823 systemd[1]: Starting Apply Kernel Variables...
Oct 10 05:36:55 np0005479823 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct 10 05:36:55 np0005479823 systemd[1]: Finished Apply Kernel Variables.
Oct 10 05:36:57 np0005479823 systemd[1]: session-10.scope: Deactivated successfully.
Oct 10 05:36:57 np0005479823 systemd[1]: session-10.scope: Consumed 2min 18.721s CPU time.
Oct 10 05:36:57 np0005479823 systemd-logind[796]: Session 10 logged out. Waiting for processes to exit.
Oct 10 05:36:57 np0005479823 systemd-logind[796]: Removed session 10.
Oct 10 05:37:02 np0005479823 systemd-logind[796]: New session 11 of user zuul.
Oct 10 05:37:02 np0005479823 systemd[1]: Started Session 11 of User zuul.
Oct 10 05:37:03 np0005479823 python3.9[41016]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:37:05 np0005479823 python3.9[41172]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Oct 10 05:37:06 np0005479823 python3.9[41325]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 10 05:37:07 np0005479823 python3.9[41483]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 10 05:37:08 np0005479823 python3.9[41643]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 05:37:09 np0005479823 python3.9[41727]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 10 05:37:12 np0005479823 python3.9[41891]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 05:37:25 np0005479823 kernel: SELinux:  Converting 2724 SID table entries...
Oct 10 05:37:25 np0005479823 kernel: SELinux:  policy capability network_peer_controls=1
Oct 10 05:37:25 np0005479823 kernel: SELinux:  policy capability open_perms=1
Oct 10 05:37:25 np0005479823 kernel: SELinux:  policy capability extended_socket_class=1
Oct 10 05:37:25 np0005479823 kernel: SELinux:  policy capability always_check_network=0
Oct 10 05:37:25 np0005479823 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 10 05:37:25 np0005479823 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 10 05:37:25 np0005479823 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 10 05:37:25 np0005479823 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Oct 10 05:37:25 np0005479823 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Oct 10 05:37:27 np0005479823 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 10 05:37:27 np0005479823 systemd[1]: Starting man-db-cache-update.service...
Oct 10 05:37:27 np0005479823 systemd[1]: Reloading.
Oct 10 05:37:27 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:37:27 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:37:27 np0005479823 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 10 05:37:28 np0005479823 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 10 05:37:28 np0005479823 systemd[1]: Finished man-db-cache-update.service.
Oct 10 05:37:28 np0005479823 systemd[1]: run-r43b98f29bd7b4209b060fbf0717fdefb.service: Deactivated successfully.
Oct 10 05:37:29 np0005479823 python3.9[42992]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 10 05:37:29 np0005479823 systemd[1]: Reloading.
Oct 10 05:37:29 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:37:29 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:37:29 np0005479823 systemd[1]: Starting Open vSwitch Database Unit...
Oct 10 05:37:29 np0005479823 chown[43033]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Oct 10 05:37:29 np0005479823 ovs-ctl[43038]: /etc/openvswitch/conf.db does not exist ... (warning).
Oct 10 05:37:29 np0005479823 ovs-ctl[43038]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Oct 10 05:37:29 np0005479823 ovs-ctl[43038]: Starting ovsdb-server [  OK  ]
Oct 10 05:37:29 np0005479823 ovs-vsctl[43087]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Oct 10 05:37:29 np0005479823 ovs-vsctl[43107]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"49146ebb-575d-4bd4-816c-0b242fb944ee\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Oct 10 05:37:29 np0005479823 ovs-ctl[43038]: Configuring Open vSwitch system IDs [  OK  ]
Oct 10 05:37:29 np0005479823 ovs-vsctl[43113]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Oct 10 05:37:29 np0005479823 ovs-ctl[43038]: Enabling remote OVSDB managers [  OK  ]
Oct 10 05:37:29 np0005479823 systemd[1]: Started Open vSwitch Database Unit.
Oct 10 05:37:29 np0005479823 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Oct 10 05:37:29 np0005479823 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Oct 10 05:37:29 np0005479823 systemd[1]: Starting Open vSwitch Forwarding Unit...
Oct 10 05:37:29 np0005479823 kernel: openvswitch: Open vSwitch switching datapath
Oct 10 05:37:29 np0005479823 ovs-ctl[43159]: Inserting openvswitch module [  OK  ]
Oct 10 05:37:30 np0005479823 ovs-ctl[43127]: Starting ovs-vswitchd [  OK  ]
Oct 10 05:37:30 np0005479823 ovs-vsctl[43178]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Oct 10 05:37:30 np0005479823 ovs-ctl[43127]: Enabling remote OVSDB managers [  OK  ]
Oct 10 05:37:30 np0005479823 systemd[1]: Started Open vSwitch Forwarding Unit.
Oct 10 05:37:30 np0005479823 systemd[1]: Starting Open vSwitch...
Oct 10 05:37:30 np0005479823 systemd[1]: Finished Open vSwitch.
Oct 10 05:37:31 np0005479823 python3.9[43329]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:37:32 np0005479823 python3.9[43481]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Oct 10 05:37:33 np0005479823 kernel: SELinux:  Converting 2738 SID table entries...
Oct 10 05:37:33 np0005479823 kernel: SELinux:  policy capability network_peer_controls=1
Oct 10 05:37:33 np0005479823 kernel: SELinux:  policy capability open_perms=1
Oct 10 05:37:33 np0005479823 kernel: SELinux:  policy capability extended_socket_class=1
Oct 10 05:37:33 np0005479823 kernel: SELinux:  policy capability always_check_network=0
Oct 10 05:37:33 np0005479823 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 10 05:37:33 np0005479823 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 10 05:37:33 np0005479823 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 10 05:37:35 np0005479823 python3.9[43636]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:37:35 np0005479823 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Oct 10 05:37:36 np0005479823 python3.9[43794]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 05:37:38 np0005479823 python3.9[43947]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:37:39 np0005479823 python3.9[44234]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 10 05:37:40 np0005479823 python3.9[44384]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:37:41 np0005479823 python3.9[44538]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 05:37:43 np0005479823 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 10 05:37:43 np0005479823 systemd[1]: Starting man-db-cache-update.service...
Oct 10 05:37:43 np0005479823 systemd[1]: Reloading.
Oct 10 05:37:43 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:37:43 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:37:43 np0005479823 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 10 05:37:43 np0005479823 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 10 05:37:43 np0005479823 systemd[1]: Finished man-db-cache-update.service.
Oct 10 05:37:43 np0005479823 systemd[1]: run-r53a3a03dafc243e28330f71e5c2c08d8.service: Deactivated successfully.
Oct 10 05:37:44 np0005479823 python3.9[44855]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 05:37:44 np0005479823 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct 10 05:37:44 np0005479823 systemd[1]: Stopped Network Manager Wait Online.
Oct 10 05:37:44 np0005479823 systemd[1]: Stopping Network Manager Wait Online...
Oct 10 05:37:44 np0005479823 systemd[1]: Stopping Network Manager...
Oct 10 05:37:44 np0005479823 NetworkManager[3947]: <info>  [1760089064.8229] caught SIGTERM, shutting down normally.
Oct 10 05:37:44 np0005479823 NetworkManager[3947]: <info>  [1760089064.8241] dhcp4 (eth0): canceled DHCP transaction
Oct 10 05:37:44 np0005479823 NetworkManager[3947]: <info>  [1760089064.8242] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 10 05:37:44 np0005479823 NetworkManager[3947]: <info>  [1760089064.8242] dhcp4 (eth0): state changed no lease
Oct 10 05:37:44 np0005479823 NetworkManager[3947]: <info>  [1760089064.8244] manager: NetworkManager state is now CONNECTED_SITE
Oct 10 05:37:44 np0005479823 NetworkManager[3947]: <info>  [1760089064.8317] exiting (success)
Oct 10 05:37:44 np0005479823 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 10 05:37:44 np0005479823 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 10 05:37:44 np0005479823 systemd[1]: NetworkManager.service: Deactivated successfully.
Oct 10 05:37:44 np0005479823 systemd[1]: Stopped Network Manager.
Oct 10 05:37:44 np0005479823 systemd[1]: NetworkManager.service: Consumed 9.978s CPU time, 4.1M memory peak, read 0B from disk, written 15.5K to disk.
Oct 10 05:37:44 np0005479823 systemd[1]: Starting Network Manager...
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.8975] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:d2fa8de7-cb1e-4362-bed6-d8a2357f049b)
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.8977] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9031] manager[0x56502bf26090]: monitoring kernel firmware directory '/lib/firmware'.
Oct 10 05:37:44 np0005479823 systemd[1]: Starting Hostname Service...
Oct 10 05:37:44 np0005479823 systemd[1]: Started Hostname Service.
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9802] hostname: hostname: using hostnamed
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9802] hostname: static hostname changed from (none) to "compute-2"
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9808] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9813] manager[0x56502bf26090]: rfkill: Wi-Fi hardware radio set enabled
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9813] manager[0x56502bf26090]: rfkill: WWAN hardware radio set enabled
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9841] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9851] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9852] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9853] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9854] manager: Networking is enabled by state file
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9856] settings: Loaded settings plugin: keyfile (internal)
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9861] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9896] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9908] dhcp: init: Using DHCP client 'internal'
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9912] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9920] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9926] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9935] device (lo): Activation: starting connection 'lo' (b2f4c0ce-6660-4aa4-ac06-17229f19cc05)
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9944] device (eth0): carrier: link connected
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9949] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9954] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9955] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9962] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9971] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9980] device (eth1): carrier: link connected
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9986] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9994] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (97070329-66da-5289-8aaa-712e43fb35a8) (indicated)
Oct 10 05:37:44 np0005479823 NetworkManager[44866]: <info>  [1760089064.9995] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0004] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0015] device (eth1): Activation: starting connection 'ci-private-network' (97070329-66da-5289-8aaa-712e43fb35a8)
Oct 10 05:37:45 np0005479823 systemd[1]: Started Network Manager.
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0027] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0321] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0324] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0327] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0330] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0332] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0334] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0335] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0341] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0347] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0350] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0373] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0383] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0391] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0392] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0397] device (lo): Activation: successful, device activated.
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0403] dhcp4 (eth0): state changed new lease, address=38.102.83.22
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0409] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct 10 05:37:45 np0005479823 systemd[1]: Starting Network Manager Wait Online...
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0480] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0487] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0494] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0498] manager: NetworkManager state is now CONNECTED_LOCAL
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0502] device (eth1): Activation: successful, device activated.
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0512] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0513] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0517] manager: NetworkManager state is now CONNECTED_SITE
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0522] device (eth0): Activation: successful, device activated.
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0526] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct 10 05:37:45 np0005479823 NetworkManager[44866]: <info>  [1760089065.0557] manager: startup complete
Oct 10 05:37:45 np0005479823 systemd[1]: Finished Network Manager Wait Online.
Oct 10 05:37:45 np0005479823 python3.9[45082]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 05:37:51 np0005479823 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 10 05:37:51 np0005479823 systemd[1]: Starting man-db-cache-update.service...
Oct 10 05:37:51 np0005479823 systemd[1]: Reloading.
Oct 10 05:37:51 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:37:51 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:37:51 np0005479823 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 10 05:37:52 np0005479823 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 10 05:37:52 np0005479823 systemd[1]: Finished man-db-cache-update.service.
Oct 10 05:37:52 np0005479823 systemd[1]: run-raa7ab797ccac4f57a8f8a7c82554e885.service: Deactivated successfully.
Oct 10 05:37:55 np0005479823 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 10 05:37:55 np0005479823 python3.9[45544]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:37:56 np0005479823 python3.9[45696]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:37:57 np0005479823 python3.9[45850]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:37:57 np0005479823 python3.9[46002]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:37:58 np0005479823 python3.9[46154]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:37:59 np0005479823 python3.9[46306]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:38:00 np0005479823 python3.9[46458]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:38:01 np0005479823 python3.9[46581]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1760089079.8330784-650-90231972118996/.source _original_basename=.v6b9jex6 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:38:01 np0005479823 python3.9[46733]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:38:02 np0005479823 python3.9[46885]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Oct 10 05:38:03 np0005479823 python3.9[47037]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:38:06 np0005479823 python3.9[47464]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Oct 10 05:38:07 np0005479823 ansible-async_wrapper.py[47639]: Invoked with j919480864945 300 /home/zuul/.ansible/tmp/ansible-tmp-1760089086.6110146-847-87432343928043/AnsiballZ_edpm_os_net_config.py _
Oct 10 05:38:07 np0005479823 ansible-async_wrapper.py[47642]: Starting module and watcher
Oct 10 05:38:07 np0005479823 ansible-async_wrapper.py[47642]: Start watching 47643 (300)
Oct 10 05:38:07 np0005479823 ansible-async_wrapper.py[47643]: Start module (47643)
Oct 10 05:38:07 np0005479823 ansible-async_wrapper.py[47639]: Return async_wrapper task started.
Oct 10 05:38:07 np0005479823 python3.9[47644]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Oct 10 05:38:08 np0005479823 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Oct 10 05:38:08 np0005479823 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Oct 10 05:38:08 np0005479823 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Oct 10 05:38:08 np0005479823 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Oct 10 05:38:08 np0005479823 kernel: cfg80211: failed to load regulatory.db
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4239] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=47645 uid=0 result="success"
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4256] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=47645 uid=0 result="success"
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4752] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4754] audit: op="connection-add" uuid="a4ef5fcb-9a34-4167-b950-5e9b3ed48e8a" name="br-ex-br" pid=47645 uid=0 result="success"
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4770] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4771] audit: op="connection-add" uuid="6ffc8a82-12bb-4304-b569-c4c1a4a906ee" name="br-ex-port" pid=47645 uid=0 result="success"
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4785] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4787] audit: op="connection-add" uuid="1dc9e23e-eafb-472b-bc4f-5974c5384b39" name="eth1-port" pid=47645 uid=0 result="success"
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4800] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4801] audit: op="connection-add" uuid="bb9ab156-ca23-43ce-9fed-d3863964f080" name="vlan20-port" pid=47645 uid=0 result="success"
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4816] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4818] audit: op="connection-add" uuid="1d7d8008-311f-4f5a-9aba-3e7542f88a0c" name="vlan21-port" pid=47645 uid=0 result="success"
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4830] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4831] audit: op="connection-add" uuid="6221bcee-a358-45d6-92e9-7684dff6685a" name="vlan22-port" pid=47645 uid=0 result="success"
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4844] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4845] audit: op="connection-add" uuid="5173518e-a62a-4bd6-b50d-9a0011562c36" name="vlan23-port" pid=47645 uid=0 result="success"
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4865] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority,connection.timestamp,ipv6.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode" pid=47645 uid=0 result="success"
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4879] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4880] audit: op="connection-add" uuid="c0226397-8fe0-490f-96fc-b4257f699165" name="br-ex-if" pid=47645 uid=0 result="success"
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4933] audit: op="connection-update" uuid="97070329-66da-5289-8aaa-712e43fb35a8" name="ci-private-network" args="ipv4.routing-rules,ipv4.addresses,ipv4.method,ipv4.never-default,ipv4.dns,ipv4.routes,connection.controller,connection.port-type,connection.slave-type,connection.master,connection.timestamp,ipv6.routing-rules,ipv6.addresses,ipv6.method,ipv6.addr-gen-mode,ipv6.dns,ipv6.routes,ovs-external-ids.data,ovs-interface.type" pid=47645 uid=0 result="success"
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4945] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4946] audit: op="connection-add" uuid="6198a381-3767-409f-ba8e-52460604a9a6" name="vlan20-if" pid=47645 uid=0 result="success"
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4959] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4960] audit: op="connection-add" uuid="592c8809-2f1b-409d-9bb5-155d2c80c0a5" name="vlan21-if" pid=47645 uid=0 result="success"
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4974] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4975] audit: op="connection-add" uuid="a4a3f96a-e8f4-4c43-aa05-97308511e250" name="vlan22-if" pid=47645 uid=0 result="success"
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4989] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.4990] audit: op="connection-add" uuid="abd24fa1-cceb-41ce-a984-04a6bd98b8cb" name="vlan23-if" pid=47645 uid=0 result="success"
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5000] audit: op="connection-delete" uuid="9070ab9c-fab6-3aab-b68b-48035af180d0" name="Wired connection 1" pid=47645 uid=0 result="success"
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5010] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5019] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5022] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (a4ef5fcb-9a34-4167-b950-5e9b3ed48e8a)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5022] audit: op="connection-activate" uuid="a4ef5fcb-9a34-4167-b950-5e9b3ed48e8a" name="br-ex-br" pid=47645 uid=0 result="success"
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5023] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5028] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5031] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (6ffc8a82-12bb-4304-b569-c4c1a4a906ee)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5032] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5036] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5039] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (1dc9e23e-eafb-472b-bc4f-5974c5384b39)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5040] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5044] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5047] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (bb9ab156-ca23-43ce-9fed-d3863964f080)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5049] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5053] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5056] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (1d7d8008-311f-4f5a-9aba-3e7542f88a0c)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5057] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5061] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5064] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (6221bcee-a358-45d6-92e9-7684dff6685a)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5065] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5070] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5073] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (5173518e-a62a-4bd6-b50d-9a0011562c36)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5073] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5075] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5076] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5080] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5084] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5087] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (c0226397-8fe0-490f-96fc-b4257f699165)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5087] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5089] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5090] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5091] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5092] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5101] device (eth1): disconnecting for new activation request.
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5101] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5103] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5104] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5105] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5107] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5110] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5112] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (6198a381-3767-409f-ba8e-52460604a9a6)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5113] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5115] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5116] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5116] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5118] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5121] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5124] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (592c8809-2f1b-409d-9bb5-155d2c80c0a5)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5124] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5126] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5127] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5128] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5129] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5132] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5135] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (a4a3f96a-e8f4-4c43-aa05-97308511e250)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5136] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5137] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5139] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5139] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5141] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5144] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5150] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (abd24fa1-cceb-41ce-a984-04a6bd98b8cb)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5151] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5153] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5154] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5155] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5156] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5166] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority,ipv6.method,ipv6.addr-gen-mode" pid=47645 uid=0 result="success"
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5168] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5170] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5171] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5176] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5180] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5183] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5185] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5186] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5190] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5193] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 kernel: ovs-system: entered promiscuous mode
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5195] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5196] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5200] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5202] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5204] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5206] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5209] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5211] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5213] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5215] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 kernel: Timeout policy base is empty
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5220] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5226] dhcp4 (eth0): canceled DHCP transaction
Oct 10 05:38:09 np0005479823 systemd-udevd[47650]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5228] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5228] dhcp4 (eth0): state changed no lease
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5230] dhcp4 (eth0): activation: beginning transaction (no timeout)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5241] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5244] audit: op="device-reapply" interface="eth1" ifindex=3 pid=47645 uid=0 result="fail" reason="Device is not activated"
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5249] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5283] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5285] dhcp4 (eth0): state changed new lease, address=38.102.83.22
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5291] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Oct 10 05:38:09 np0005479823 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5329] device (eth1): disconnecting for new activation request.
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5330] audit: op="connection-activate" uuid="97070329-66da-5289-8aaa-712e43fb35a8" name="ci-private-network" pid=47645 uid=0 result="success"
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5331] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5335] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5424] device (eth1): Activation: starting connection 'ci-private-network' (97070329-66da-5289-8aaa-712e43fb35a8)
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5428] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5442] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5446] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5453] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5457] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5462] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5464] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5466] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5468] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5470] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5472] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5474] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=47645 uid=0 result="success"
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5477] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5483] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5489] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5493] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5497] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5501] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5505] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5509] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5513] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5517] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5521] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5525] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5529] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5534] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5537] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 kernel: br-ex: entered promiscuous mode
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5581] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5583] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5589] device (eth1): Activation: successful, device activated.
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5681] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5694] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 kernel: vlan22: entered promiscuous mode
Oct 10 05:38:09 np0005479823 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5724] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5725] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5731] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 10 05:38:09 np0005479823 kernel: vlan23: entered promiscuous mode
Oct 10 05:38:09 np0005479823 systemd-udevd[47649]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5853] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5867] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5884] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5885] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5892] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 10 05:38:09 np0005479823 kernel: vlan20: entered promiscuous mode
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5932] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5947] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5962] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5963] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.5970] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 10 05:38:09 np0005479823 kernel: vlan21: entered promiscuous mode
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.6031] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.6044] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.6059] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.6060] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.6066] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.6097] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.6110] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.6123] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.6124] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 10 05:38:09 np0005479823 NetworkManager[44866]: <info>  [1760089089.6130] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 10 05:38:10 np0005479823 NetworkManager[44866]: <info>  [1760089090.7288] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=47645 uid=0 result="success"
Oct 10 05:38:10 np0005479823 NetworkManager[44866]: <info>  [1760089090.9024] checkpoint[0x56502befc950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Oct 10 05:38:10 np0005479823 NetworkManager[44866]: <info>  [1760089090.9025] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=47645 uid=0 result="success"
Oct 10 05:38:11 np0005479823 NetworkManager[44866]: <info>  [1760089091.1450] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=47645 uid=0 result="success"
Oct 10 05:38:11 np0005479823 NetworkManager[44866]: <info>  [1760089091.1463] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=47645 uid=0 result="success"
Oct 10 05:38:11 np0005479823 NetworkManager[44866]: <info>  [1760089091.3273] audit: op="networking-control" arg="global-dns-configuration" pid=47645 uid=0 result="success"
Oct 10 05:38:11 np0005479823 NetworkManager[44866]: <info>  [1760089091.3301] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Oct 10 05:38:11 np0005479823 NetworkManager[44866]: <info>  [1760089091.3329] audit: op="networking-control" arg="global-dns-configuration" pid=47645 uid=0 result="success"
Oct 10 05:38:11 np0005479823 NetworkManager[44866]: <info>  [1760089091.3346] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=47645 uid=0 result="success"
Oct 10 05:38:11 np0005479823 python3.9[48004]: ansible-ansible.legacy.async_status Invoked with jid=j919480864945.47639 mode=status _async_dir=/root/.ansible_async
Oct 10 05:38:11 np0005479823 NetworkManager[44866]: <info>  [1760089091.5134] checkpoint[0x56502befca20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Oct 10 05:38:11 np0005479823 NetworkManager[44866]: <info>  [1760089091.5138] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=47645 uid=0 result="success"
Oct 10 05:38:11 np0005479823 ansible-async_wrapper.py[47643]: Module complete (47643)
Oct 10 05:38:12 np0005479823 ansible-async_wrapper.py[47642]: Done in kid B.
Oct 10 05:38:14 np0005479823 python3.9[48108]: ansible-ansible.legacy.async_status Invoked with jid=j919480864945.47639 mode=status _async_dir=/root/.ansible_async
Oct 10 05:38:14 np0005479823 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 10 05:38:15 np0005479823 python3.9[48210]: ansible-ansible.legacy.async_status Invoked with jid=j919480864945.47639 mode=cleanup _async_dir=/root/.ansible_async
Oct 10 05:38:16 np0005479823 python3.9[48362]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:38:16 np0005479823 python3.9[48485]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760089095.8322594-928-59179713041241/.source.returncode _original_basename=.x1t8il0z follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:38:17 np0005479823 python3.9[48637]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:38:18 np0005479823 python3.9[48761]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760089097.23963-976-280840722254279/.source.cfg _original_basename=.0ujjls6y follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:38:19 np0005479823 python3.9[48913]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 05:38:19 np0005479823 systemd[1]: Reloading Network Manager...
Oct 10 05:38:19 np0005479823 NetworkManager[44866]: <info>  [1760089099.5072] audit: op="reload" arg="0" pid=48917 uid=0 result="success"
Oct 10 05:38:19 np0005479823 NetworkManager[44866]: <info>  [1760089099.5078] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Oct 10 05:38:19 np0005479823 systemd[1]: Reloaded Network Manager.
Oct 10 05:38:20 np0005479823 systemd[1]: session-11.scope: Deactivated successfully.
Oct 10 05:38:20 np0005479823 systemd[1]: session-11.scope: Consumed 53.606s CPU time.
Oct 10 05:38:20 np0005479823 systemd-logind[796]: Session 11 logged out. Waiting for processes to exit.
Oct 10 05:38:20 np0005479823 systemd-logind[796]: Removed session 11.
Oct 10 05:38:25 np0005479823 systemd-logind[796]: New session 12 of user zuul.
Oct 10 05:38:25 np0005479823 systemd[1]: Started Session 12 of User zuul.
Oct 10 05:38:26 np0005479823 python3.9[49101]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:38:27 np0005479823 python3.9[49255]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 05:38:29 np0005479823 python3.9[49449]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:38:29 np0005479823 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 10 05:38:29 np0005479823 systemd[1]: session-12.scope: Deactivated successfully.
Oct 10 05:38:29 np0005479823 systemd[1]: session-12.scope: Consumed 2.384s CPU time.
Oct 10 05:38:29 np0005479823 systemd-logind[796]: Session 12 logged out. Waiting for processes to exit.
Oct 10 05:38:29 np0005479823 systemd-logind[796]: Removed session 12.
Oct 10 05:38:35 np0005479823 systemd-logind[796]: New session 13 of user zuul.
Oct 10 05:38:35 np0005479823 systemd[1]: Started Session 13 of User zuul.
Oct 10 05:38:36 np0005479823 python3.9[49631]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:38:37 np0005479823 python3.9[49785]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:38:38 np0005479823 python3.9[49942]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 05:38:39 np0005479823 python3.9[50026]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 05:38:41 np0005479823 python3.9[50180]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 05:38:42 np0005479823 python3.9[50375]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:38:43 np0005479823 python3.9[50527]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:38:43 np0005479823 systemd[1]: var-lib-containers-storage-overlay-compat3112466313-merged.mount: Deactivated successfully.
Oct 10 05:38:43 np0005479823 podman[50528]: 2025-10-10 09:38:43.934936336 +0000 UTC m=+0.067274571 system refresh
Oct 10 05:38:44 np0005479823 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 05:38:45 np0005479823 python3.9[50690]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:38:45 np0005479823 python3.9[50813]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760089124.321155-199-136222184130636/.source.json follow=False _original_basename=podman_network_config.j2 checksum=bc749cbd8dd097470751e86f47dabc032c51f5ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:38:46 np0005479823 python3.9[50965]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:38:47 np0005479823 python3.9[51088]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760089126.0124109-245-101609462851517/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:38:48 np0005479823 python3.9[51240]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:38:48 np0005479823 python3.9[51392]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:38:49 np0005479823 python3.9[51544]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:38:50 np0005479823 python3.9[51696]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:38:51 np0005479823 python3.9[51848]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 05:38:53 np0005479823 python3.9[52001]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:38:54 np0005479823 python3.9[52155]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:38:55 np0005479823 python3.9[52307]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:38:56 np0005479823 python3.9[52459]: ansible-service_facts Invoked
Oct 10 05:38:56 np0005479823 network[52476]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 10 05:38:56 np0005479823 network[52477]: 'network-scripts' will be removed from distribution in near future.
Oct 10 05:38:56 np0005479823 network[52478]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 10 05:39:02 np0005479823 python3.9[52932]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 05:39:05 np0005479823 python3.9[53085]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Oct 10 05:39:07 np0005479823 python3.9[53237]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:39:07 np0005479823 python3.9[53362]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760089146.6440942-641-59427780666903/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:39:08 np0005479823 python3.9[53516]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:39:09 np0005479823 python3.9[53641]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760089148.2733202-687-247464561094948/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:39:11 np0005479823 python3.9[53795]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:39:12 np0005479823 python3.9[53949]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 05:39:14 np0005479823 python3.9[54033]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:39:16 np0005479823 python3.9[54187]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 05:39:17 np0005479823 python3.9[54271]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 05:39:17 np0005479823 chronyd[785]: chronyd exiting
Oct 10 05:39:17 np0005479823 systemd[1]: Stopping NTP client/server...
Oct 10 05:39:17 np0005479823 systemd[1]: chronyd.service: Deactivated successfully.
Oct 10 05:39:17 np0005479823 systemd[1]: Stopped NTP client/server.
Oct 10 05:39:17 np0005479823 systemd[1]: Starting NTP client/server...
Oct 10 05:39:17 np0005479823 chronyd[54279]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct 10 05:39:17 np0005479823 chronyd[54279]: Frequency -32.237 +/- 0.239 ppm read from /var/lib/chrony/drift
Oct 10 05:39:17 np0005479823 chronyd[54279]: Loaded seccomp filter (level 2)
Oct 10 05:39:17 np0005479823 systemd[1]: Started NTP client/server.
Oct 10 05:39:18 np0005479823 systemd[1]: session-13.scope: Deactivated successfully.
Oct 10 05:39:18 np0005479823 systemd[1]: session-13.scope: Consumed 26.664s CPU time.
Oct 10 05:39:18 np0005479823 systemd-logind[796]: Session 13 logged out. Waiting for processes to exit.
Oct 10 05:39:18 np0005479823 systemd-logind[796]: Removed session 13.
Oct 10 05:39:24 np0005479823 systemd-logind[796]: New session 14 of user zuul.
Oct 10 05:39:24 np0005479823 systemd[1]: Started Session 14 of User zuul.
Oct 10 05:39:24 np0005479823 python3.9[54460]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:39:25 np0005479823 python3.9[54612]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:39:26 np0005479823 python3.9[54735]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760089165.1210947-64-152240516221048/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:39:26 np0005479823 systemd[1]: session-14.scope: Deactivated successfully.
Oct 10 05:39:26 np0005479823 systemd[1]: session-14.scope: Consumed 1.570s CPU time.
Oct 10 05:39:26 np0005479823 systemd-logind[796]: Session 14 logged out. Waiting for processes to exit.
Oct 10 05:39:26 np0005479823 systemd-logind[796]: Removed session 14.
Oct 10 05:39:32 np0005479823 systemd-logind[796]: New session 15 of user zuul.
Oct 10 05:39:32 np0005479823 systemd[1]: Started Session 15 of User zuul.
Oct 10 05:39:33 np0005479823 python3.9[54913]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:39:34 np0005479823 python3.9[55069]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:39:35 np0005479823 python3.9[55244]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:39:36 np0005479823 python3.9[55367]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1760089174.75691-85-137419652153510/.source.json _original_basename=.omsu_u7_ follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:39:37 np0005479823 python3.9[55519]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:39:37 np0005479823 python3.9[55642]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760089176.8322563-154-102924880549786/.source _original_basename=.5nbob3pq follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:39:38 np0005479823 python3.9[55794]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:39:39 np0005479823 python3.9[55946]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:39:40 np0005479823 python3.9[56069]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760089179.2197025-226-54817425253456/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:39:40 np0005479823 python3.9[56221]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:39:41 np0005479823 python3.9[56344]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760089180.429285-226-196869450885258/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:39:42 np0005479823 python3.9[56496]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:39:43 np0005479823 python3.9[56648]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:39:44 np0005479823 python3.9[56771]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760089182.9457266-338-65113596835969/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:39:44 np0005479823 python3.9[56923]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:39:45 np0005479823 python3.9[57046]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760089184.3568327-383-210677791656687/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:39:46 np0005479823 python3.9[57198]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:39:46 np0005479823 systemd[1]: Reloading.
Oct 10 05:39:46 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:39:46 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:39:46 np0005479823 systemd[1]: Reloading.
Oct 10 05:39:46 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:39:46 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:39:47 np0005479823 systemd[1]: Starting EDPM Container Shutdown...
Oct 10 05:39:47 np0005479823 systemd[1]: Finished EDPM Container Shutdown.
Oct 10 05:39:47 np0005479823 python3.9[57425]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:39:48 np0005479823 python3.9[57548]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760089187.425473-452-9103470690166/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:39:49 np0005479823 python3.9[57700]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:39:49 np0005479823 python3.9[57823]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760089188.8040833-496-172739657953163/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:39:51 np0005479823 python3.9[57975]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:39:51 np0005479823 systemd[1]: Reloading.
Oct 10 05:39:51 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:39:51 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:39:51 np0005479823 systemd[1]: Reloading.
Oct 10 05:39:51 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:39:51 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:39:51 np0005479823 systemd[1]: Starting Create netns directory...
Oct 10 05:39:51 np0005479823 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 10 05:39:51 np0005479823 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 10 05:39:51 np0005479823 systemd[1]: Finished Create netns directory.
Oct 10 05:39:53 np0005479823 python3.9[58201]: ansible-ansible.builtin.service_facts Invoked
Oct 10 05:39:53 np0005479823 network[58218]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 10 05:39:53 np0005479823 network[58219]: 'network-scripts' will be removed from distribution in near future.
Oct 10 05:39:53 np0005479823 network[58220]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 10 05:39:59 np0005479823 python3.9[58484]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:39:59 np0005479823 systemd[1]: Reloading.
Oct 10 05:39:59 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:39:59 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:39:59 np0005479823 systemd[1]: Stopping IPv4 firewall with iptables...
Oct 10 05:39:59 np0005479823 iptables.init[58525]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Oct 10 05:40:00 np0005479823 iptables.init[58525]: iptables: Flushing firewall rules: [  OK  ]
Oct 10 05:40:00 np0005479823 systemd[1]: iptables.service: Deactivated successfully.
Oct 10 05:40:00 np0005479823 systemd[1]: Stopped IPv4 firewall with iptables.
Oct 10 05:40:00 np0005479823 python3.9[58722]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:40:01 np0005479823 python3.9[58876]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:40:02 np0005479823 systemd[1]: Reloading.
Oct 10 05:40:02 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:40:02 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:40:02 np0005479823 systemd[1]: Starting Netfilter Tables...
Oct 10 05:40:02 np0005479823 systemd[1]: Finished Netfilter Tables.
Oct 10 05:40:03 np0005479823 python3.9[59068]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:40:04 np0005479823 python3.9[59221]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:40:05 np0005479823 python3.9[59346]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760089204.096978-704-157209696244298/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=4729b6ffc5b555fa142bf0b6e6dc15609cb89a22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:40:06 np0005479823 python3.9[59497]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 05:40:31 np0005479823 systemd[1]: session-15.scope: Deactivated successfully.
Oct 10 05:40:31 np0005479823 systemd[1]: session-15.scope: Consumed 20.405s CPU time.
Oct 10 05:40:31 np0005479823 systemd-logind[796]: Session 15 logged out. Waiting for processes to exit.
Oct 10 05:40:31 np0005479823 systemd-logind[796]: Removed session 15.
Oct 10 05:40:43 np0005479823 systemd-logind[796]: New session 16 of user zuul.
Oct 10 05:40:44 np0005479823 systemd[1]: Started Session 16 of User zuul.
Oct 10 05:40:45 np0005479823 python3.9[59690]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:40:46 np0005479823 python3.9[59846]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:40:47 np0005479823 python3.9[60021]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:40:47 np0005479823 python3.9[60099]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.xwfn6yqr recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:40:48 np0005479823 python3.9[60251]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:40:49 np0005479823 python3.9[60329]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.xbt1rhul recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:40:50 np0005479823 python3.9[60481]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:40:50 np0005479823 python3.9[60633]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:40:51 np0005479823 python3.9[60711]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:40:51 np0005479823 python3.9[60863]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:40:52 np0005479823 python3.9[60941]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:40:53 np0005479823 python3.9[61093]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:40:54 np0005479823 python3.9[61245]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:40:54 np0005479823 python3.9[61323]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:40:55 np0005479823 python3.9[61476]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:40:56 np0005479823 python3.9[61554]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:40:57 np0005479823 python3.9[61706]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:40:57 np0005479823 systemd[1]: Reloading.
Oct 10 05:40:57 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:40:57 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:40:58 np0005479823 python3.9[61894]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:40:59 np0005479823 python3.9[61972]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:40:59 np0005479823 python3.9[62124]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:41:00 np0005479823 python3.9[62202]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:41:01 np0005479823 python3.9[62354]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:41:01 np0005479823 systemd[1]: Reloading.
Oct 10 05:41:01 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:41:01 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:41:01 np0005479823 systemd[1]: Starting Create netns directory...
Oct 10 05:41:01 np0005479823 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 10 05:41:01 np0005479823 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 10 05:41:01 np0005479823 systemd[1]: Finished Create netns directory.
Oct 10 05:41:02 np0005479823 python3.9[62545]: ansible-ansible.builtin.service_facts Invoked
Oct 10 05:41:02 np0005479823 network[62562]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 10 05:41:02 np0005479823 network[62563]: 'network-scripts' will be removed from distribution in near future.
Oct 10 05:41:02 np0005479823 network[62564]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 10 05:41:07 np0005479823 python3.9[62827]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:41:08 np0005479823 python3.9[62905]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:41:09 np0005479823 python3.9[63057]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:41:09 np0005479823 python3.9[63209]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:41:10 np0005479823 python3.9[63332]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760089269.5043862-611-154740142628858/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:41:11 np0005479823 python3.9[63484]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct 10 05:41:11 np0005479823 systemd[1]: Starting Time & Date Service...
Oct 10 05:41:11 np0005479823 systemd[1]: Started Time & Date Service.
Oct 10 05:41:13 np0005479823 python3.9[63640]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:41:13 np0005479823 python3.9[63792]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:41:14 np0005479823 python3.9[63915]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760089273.3465185-715-137411755594630/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:41:15 np0005479823 python3.9[64067]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:41:15 np0005479823 python3.9[64190]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760089274.887268-760-28169979534399/.source.yaml _original_basename=.fbb2i29c follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:41:16 np0005479823 python3.9[64343]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:41:17 np0005479823 python3.9[64466]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760089276.3215218-806-162386162924705/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:41:18 np0005479823 python3.9[64618]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:41:19 np0005479823 python3.9[64771]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:41:20 np0005479823 python3[64924]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 10 05:41:21 np0005479823 python3.9[65076]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:41:21 np0005479823 python3.9[65199]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760089280.8458064-923-97021584640293/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:41:22 np0005479823 python3.9[65351]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:41:23 np0005479823 python3.9[65474]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760089282.419474-968-55220374828843/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:41:24 np0005479823 python3.9[65626]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:41:25 np0005479823 python3.9[65749]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760089283.9498124-1013-3757482558625/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:41:26 np0005479823 python3.9[65901]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:41:26 np0005479823 chronyd[54279]: Selected source 142.4.192.253 (pool.ntp.org)
Oct 10 05:41:26 np0005479823 python3.9[66024]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760089285.580667-1058-273865495106653/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:41:27 np0005479823 python3.9[66176]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:41:28 np0005479823 python3.9[66299]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760089287.0687494-1103-98774151432424/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:41:29 np0005479823 python3.9[66451]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:41:30 np0005479823 python3.9[66603]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:41:31 np0005479823 python3.9[66762]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:41:32 np0005479823 python3.9[66915]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:41:32 np0005479823 python3.9[67067]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:41:33 np0005479823 python3.9[67219]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct 10 05:41:34 np0005479823 python3.9[67372]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct 10 05:41:35 np0005479823 systemd-logind[796]: Session 16 logged out. Waiting for processes to exit.
Oct 10 05:41:35 np0005479823 systemd[1]: session-16.scope: Deactivated successfully.
Oct 10 05:41:35 np0005479823 systemd[1]: session-16.scope: Consumed 31.794s CPU time.
Oct 10 05:41:35 np0005479823 systemd-logind[796]: Removed session 16.
Oct 10 05:41:41 np0005479823 systemd-logind[796]: New session 17 of user zuul.
Oct 10 05:41:41 np0005479823 systemd[1]: Started Session 17 of User zuul.
Oct 10 05:41:41 np0005479823 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 10 05:41:42 np0005479823 python3.9[67553]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Oct 10 05:41:43 np0005479823 python3.9[67707]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:41:44 np0005479823 python3.9[67859]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:41:45 np0005479823 python3.9[68011]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCs576V3VvbSgv48Ml4JM3ripPY5VUVh8vdkDr1njjfd7J/WrQQkTf/D0b7+eGTXj3Y1fx1/haVrDafo7g0NqcSZX+zNUgTCnYPWafo7RMG4Q7ITVk1NPIkAC1cDUxHNeWhXaOkxCz96sTkO4aNW3uoFjsp2JkJtRJmHzT7q/bc0N9x7YcWh9vwRRBiOKlV8cWMHuHUzOlloEQLN67Dht1xHWr1eO/SITqUlWY13tc/54xQuo8nBQNNX9ArhMbJz2a9AoNVUAAYFF8hWFI5ES/GL9qsCp8dnmAtrY4Rc07QmHo1RkcjXe1f6D+vymRIP3YOqIjlWp0blCTfcCGno5lBa9f5JachIsogk+5+GYx4AAbWLyxxecfKzdCxrGnQlfFgldc1xDN1RG+8HwFEAuHQDWTCDUgF67FXSHy7aVxrdzU4046193/o3VKTpSaJmFldASxFgyUeujs56OgC0qYM0zKV4jOsMBcocVHvH/1FOPWIr81XXYvu6C/Ntd6sBj0=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGSf7pFS/S1SmUMk/yMobwR+LTaQZlAhBqo7Ido5r8dg#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBB1l0EOuMseZ7ulHkfzzVtKv+5A9EWRy+oXVB+t370vohhJoN3+lviS8xoR8GttJUcHVCaeioniRtOWysbNdC0I=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDarlOcgDXqRdSww3oIuqu7nGIBJToNGSnU1ljOr6GTlHTxxOoTztIrvZrPaJA8w/ixztkhFZZSdRPw4meYayY05CNu9SneiL62twzDLDsqeDPAspkh69Ljj5aGCLf6GJDiK0m2h1jLDIFtXH3lIQE9781zA7ZQ8+/xeF4yRS1/Fb5CXDG+oi/J0veCffs6t0TYmrUfSgS2H2y0UxNu7C6GoQKRde1arPLOYexvlg2RjlWM6Ex4JCqTAd9EN330Kh4HUr3r46ET8mwi1mPndibbiW0heXgrg8FeV5hBqOxQsGgLEKpX1cNAz6Rr0C5Hg1xfGcsJtep88vbJFmMyV1jNowDtJCYpprqa16Nj35HBuuz7zbzVlIdeQhEJ9I4I7eNhUxlb2/XYRXy2hfsrM9D2TP7B+bVPLjlqgqy8stBhGBCtH32ppNsXHE6uGPHMovcz2VhbP/P3sp9NQV+hF2Q0RbBXrQZkEI9YJdhxQw5hyOqwfPrEEBFy8FpzSKfBAW0=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIC1nQuW/lbxVJxo9H20J7i0+Z6cHtufrF4VbA6zs724f#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBB0oTxSrAqx34tAubl7rouYPI7qhs6NhoDmGr3PTW1+mypEQw0EO+pZ99zSRnweC5RBoL080AgUKo7KN+v3LDHw=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUnwO+j5aInA4FKMx5pWF8B0Zp6L17GsYV5RBbu6iT67LtXjwbz5nP4EC7t80boMHnS7DRNCAxF0FNMVhQ9o4+1E1n2mrUxxAw8YxcZTabu/lAqRb4I6RzmXdXSA9mF8O3onswi/KhJg6YUTFEWCuxWrMLco15IatKi+hNqcRUk1DreR2L/YN0W5qXkvj1z3aoph1h3Yn1lRjuQDrVHp6lCywixC2pHwYG+CrPyX+0PkXJg+JRvRdxNCIw0D0zOkJrnppmT8XpIj42JLRUGGV592XFVXHiEhZdOI2bdzPy490EfIbWF9Symqi/V5vf8SK9LMOscHXkD7jsT6VKzsUXyk6/IzzZ2TzhD173lt8HpRJyaZq4ME0ZSVYNyD58DN/CQ3xpO1c1E8Wp4fUswc4WHmb/eILnY0lDXOZt6Hb/e+K6RHu5e5GOo0KSfei/LyrqJkBQn2P8UkbJvrUh2bNw+whjvT5CmXd3rPCw+Xq3/K3Gpit1K/4pC0zGC+CQr7E=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILklS4uW4IrGY5dWZTg4VeKVeFB3jPeUpu/8f4D1+rd5#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCelD2lLiMWT09YjxTI9IfdSnHfdMuHKAAEYFKZmJg34mgwUIDqUQqoc9I6a7Ps9pRizY+UpHWL//lD7hvvhD5k=#012 create=True mode=0644 path=/tmp/ansible.3fb8vn2c state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:41:46 np0005479823 python3.9[68163]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.3fb8vn2c' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:41:47 np0005479823 python3.9[68317]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.3fb8vn2c state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:41:47 np0005479823 systemd[1]: session-17.scope: Deactivated successfully.
Oct 10 05:41:47 np0005479823 systemd[1]: session-17.scope: Consumed 3.529s CPU time.
Oct 10 05:41:47 np0005479823 systemd-logind[796]: Session 17 logged out. Waiting for processes to exit.
Oct 10 05:41:47 np0005479823 systemd-logind[796]: Removed session 17.
Oct 10 05:41:53 np0005479823 systemd-logind[796]: New session 18 of user zuul.
Oct 10 05:41:53 np0005479823 systemd[1]: Started Session 18 of User zuul.
Oct 10 05:41:54 np0005479823 python3.9[68495]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:41:55 np0005479823 python3.9[68651]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct 10 05:41:56 np0005479823 python3.9[68805]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 05:41:57 np0005479823 python3.9[68958]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:41:58 np0005479823 python3.9[69111]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:41:59 np0005479823 python3.9[69265]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:42:00 np0005479823 python3.9[69420]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:42:01 np0005479823 systemd[1]: session-18.scope: Deactivated successfully.
Oct 10 05:42:01 np0005479823 systemd[1]: session-18.scope: Consumed 4.313s CPU time.
Oct 10 05:42:01 np0005479823 systemd-logind[796]: Session 18 logged out. Waiting for processes to exit.
Oct 10 05:42:01 np0005479823 systemd-logind[796]: Removed session 18.
Oct 10 05:42:06 np0005479823 systemd-logind[796]: New session 19 of user zuul.
Oct 10 05:42:06 np0005479823 systemd[1]: Started Session 19 of User zuul.
Oct 10 05:42:07 np0005479823 python3.9[69599]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:42:08 np0005479823 python3.9[69755]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 05:42:09 np0005479823 python3.9[69839]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 10 05:42:11 np0005479823 python3.9[69990]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:42:13 np0005479823 python3.9[70141]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 10 05:42:14 np0005479823 python3.9[70291]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:42:14 np0005479823 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 05:42:14 np0005479823 python3.9[70442]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:42:15 np0005479823 systemd[1]: session-19.scope: Deactivated successfully.
Oct 10 05:42:15 np0005479823 systemd[1]: session-19.scope: Consumed 5.810s CPU time.
Oct 10 05:42:15 np0005479823 systemd-logind[796]: Session 19 logged out. Waiting for processes to exit.
Oct 10 05:42:15 np0005479823 systemd-logind[796]: Removed session 19.
Oct 10 05:42:23 np0005479823 systemd-logind[796]: New session 20 of user zuul.
Oct 10 05:42:23 np0005479823 systemd[1]: Started Session 20 of User zuul.
Oct 10 05:42:29 np0005479823 python3[71208]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:42:31 np0005479823 python3[71303]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 10 05:42:33 np0005479823 python3[71330]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:42:33 np0005479823 python3[71356]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=20G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:42:33 np0005479823 kernel: loop: module loaded
Oct 10 05:42:33 np0005479823 kernel: loop3: detected capacity change from 0 to 41943040
Oct 10 05:42:34 np0005479823 python3[71391]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:42:34 np0005479823 lvm[71394]: PV /dev/loop3 not used.
Oct 10 05:42:34 np0005479823 lvm[71396]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct 10 05:42:34 np0005479823 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Oct 10 05:42:34 np0005479823 lvm[71399]:  1 logical volume(s) in volume group "ceph_vg0" now active
Oct 10 05:42:34 np0005479823 lvm[71406]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct 10 05:42:34 np0005479823 lvm[71406]: VG ceph_vg0 finished
Oct 10 05:42:34 np0005479823 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Oct 10 05:42:34 np0005479823 python3[71484]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 10 05:42:35 np0005479823 python3[71557]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760089354.595812-33485-109348056988994/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:42:36 np0005479823 python3[71607]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:42:36 np0005479823 systemd[1]: Reloading.
Oct 10 05:42:36 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:42:36 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:42:36 np0005479823 systemd[1]: Starting Ceph OSD losetup...
Oct 10 05:42:36 np0005479823 bash[71648]: /dev/loop3: [64513]:4555204 (/var/lib/ceph-osd-0.img)
Oct 10 05:42:36 np0005479823 systemd[1]: Finished Ceph OSD losetup.
Oct 10 05:42:36 np0005479823 lvm[71649]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct 10 05:42:36 np0005479823 lvm[71649]: VG ceph_vg0 finished
Oct 10 05:42:38 np0005479823 python3[71673]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:42:56 np0005479823 systemd[1]: packagekit.service: Deactivated successfully.
Oct 10 05:44:10 np0005479823 systemd[1]: Created slice User Slice of UID 42477.
Oct 10 05:44:10 np0005479823 systemd[1]: Starting User Runtime Directory /run/user/42477...
Oct 10 05:44:10 np0005479823 systemd-logind[796]: New session 21 of user ceph-admin.
Oct 10 05:44:10 np0005479823 systemd[1]: Finished User Runtime Directory /run/user/42477.
Oct 10 05:44:10 np0005479823 systemd[1]: Starting User Manager for UID 42477...
Oct 10 05:44:11 np0005479823 systemd[71724]: Queued start job for default target Main User Target.
Oct 10 05:44:11 np0005479823 systemd[71724]: Created slice User Application Slice.
Oct 10 05:44:11 np0005479823 systemd[71724]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 10 05:44:11 np0005479823 systemd[71724]: Started Daily Cleanup of User's Temporary Directories.
Oct 10 05:44:11 np0005479823 systemd[71724]: Reached target Paths.
Oct 10 05:44:11 np0005479823 systemd[71724]: Reached target Timers.
Oct 10 05:44:11 np0005479823 systemd[71724]: Starting D-Bus User Message Bus Socket...
Oct 10 05:44:11 np0005479823 systemd[71724]: Starting Create User's Volatile Files and Directories...
Oct 10 05:44:11 np0005479823 systemd[71724]: Finished Create User's Volatile Files and Directories.
Oct 10 05:44:11 np0005479823 systemd[71724]: Listening on D-Bus User Message Bus Socket.
Oct 10 05:44:11 np0005479823 systemd[71724]: Reached target Sockets.
Oct 10 05:44:11 np0005479823 systemd[71724]: Reached target Basic System.
Oct 10 05:44:11 np0005479823 systemd[71724]: Reached target Main User Target.
Oct 10 05:44:11 np0005479823 systemd[71724]: Startup finished in 137ms.
Oct 10 05:44:11 np0005479823 systemd[1]: Started User Manager for UID 42477.
Oct 10 05:44:11 np0005479823 systemd[1]: Started Session 21 of User ceph-admin.
Oct 10 05:44:11 np0005479823 systemd-logind[796]: New session 23 of user ceph-admin.
Oct 10 05:44:11 np0005479823 systemd[1]: Started Session 23 of User ceph-admin.
Oct 10 05:44:11 np0005479823 systemd-logind[796]: New session 24 of user ceph-admin.
Oct 10 05:44:11 np0005479823 systemd[1]: Started Session 24 of User ceph-admin.
Oct 10 05:44:11 np0005479823 systemd-logind[796]: New session 25 of user ceph-admin.
Oct 10 05:44:11 np0005479823 systemd[1]: Started Session 25 of User ceph-admin.
Oct 10 05:44:12 np0005479823 systemd-logind[796]: New session 26 of user ceph-admin.
Oct 10 05:44:12 np0005479823 systemd[1]: Started Session 26 of User ceph-admin.
Oct 10 05:44:12 np0005479823 systemd-logind[796]: New session 27 of user ceph-admin.
Oct 10 05:44:12 np0005479823 systemd[1]: Started Session 27 of User ceph-admin.
Oct 10 05:44:13 np0005479823 systemd-logind[796]: New session 28 of user ceph-admin.
Oct 10 05:44:13 np0005479823 systemd[1]: Started Session 28 of User ceph-admin.
Oct 10 05:44:13 np0005479823 systemd-logind[796]: New session 29 of user ceph-admin.
Oct 10 05:44:13 np0005479823 systemd[1]: Started Session 29 of User ceph-admin.
Oct 10 05:44:13 np0005479823 systemd-logind[796]: New session 30 of user ceph-admin.
Oct 10 05:44:13 np0005479823 systemd[1]: Started Session 30 of User ceph-admin.
Oct 10 05:44:14 np0005479823 systemd-logind[796]: New session 31 of user ceph-admin.
Oct 10 05:44:14 np0005479823 systemd[1]: Started Session 31 of User ceph-admin.
Oct 10 05:44:15 np0005479823 systemd-logind[796]: New session 32 of user ceph-admin.
Oct 10 05:44:15 np0005479823 systemd[1]: Started Session 32 of User ceph-admin.
Oct 10 05:44:15 np0005479823 systemd-logind[796]: New session 33 of user ceph-admin.
Oct 10 05:44:15 np0005479823 systemd[1]: Started Session 33 of User ceph-admin.
Oct 10 05:44:16 np0005479823 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 05:44:54 np0005479823 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 05:44:55 np0005479823 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 05:44:55 np0005479823 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 72325 (sysctl)
Oct 10 05:44:55 np0005479823 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 05:44:55 np0005479823 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Oct 10 05:44:55 np0005479823 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Oct 10 05:44:56 np0005479823 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 05:44:56 np0005479823 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 05:44:56 np0005479823 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 05:44:58 np0005479823 systemd[1]: var-lib-containers-storage-overlay-compat2331499709-merged.mount: Deactivated successfully.
Oct 10 05:44:59 np0005479823 systemd[1]: var-lib-containers-storage-overlay-compat2331499709-lower\x2dmapped.mount: Deactivated successfully.
Oct 10 05:45:13 np0005479823 podman[72501]: 2025-10-10 09:45:13.458217925 +0000 UTC m=+16.482446633 container create 063d54e38ca2a632e79dd75b4c61fe8deb94f9437b08bce0082b19a7a58a811f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nifty_knuth, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True)
Oct 10 05:45:13 np0005479823 systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck1332118266-merged.mount: Deactivated successfully.
Oct 10 05:45:13 np0005479823 systemd[1]: Created slice Virtual Machine and Container Slice.
Oct 10 05:45:13 np0005479823 systemd[1]: Started libpod-conmon-063d54e38ca2a632e79dd75b4c61fe8deb94f9437b08bce0082b19a7a58a811f.scope.
Oct 10 05:45:13 np0005479823 podman[72501]: 2025-10-10 09:45:13.443390322 +0000 UTC m=+16.467619080 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:45:13 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:45:13 np0005479823 podman[72501]: 2025-10-10 09:45:13.550137302 +0000 UTC m=+16.574366040 container init 063d54e38ca2a632e79dd75b4c61fe8deb94f9437b08bce0082b19a7a58a811f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nifty_knuth, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 05:45:13 np0005479823 podman[72501]: 2025-10-10 09:45:13.557425935 +0000 UTC m=+16.581654653 container start 063d54e38ca2a632e79dd75b4c61fe8deb94f9437b08bce0082b19a7a58a811f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nifty_knuth, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True)
Oct 10 05:45:13 np0005479823 podman[72501]: 2025-10-10 09:45:13.560501944 +0000 UTC m=+16.584730672 container attach 063d54e38ca2a632e79dd75b4c61fe8deb94f9437b08bce0082b19a7a58a811f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nifty_knuth, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 05:45:13 np0005479823 nifty_knuth[72563]: 167 167
Oct 10 05:45:13 np0005479823 systemd[1]: libpod-063d54e38ca2a632e79dd75b4c61fe8deb94f9437b08bce0082b19a7a58a811f.scope: Deactivated successfully.
Oct 10 05:45:13 np0005479823 podman[72501]: 2025-10-10 09:45:13.563618823 +0000 UTC m=+16.587847541 container died 063d54e38ca2a632e79dd75b4c61fe8deb94f9437b08bce0082b19a7a58a811f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nifty_knuth, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Oct 10 05:45:13 np0005479823 systemd[1]: var-lib-containers-storage-overlay-b938549fcb43b9793c46950d25b4c88bafab5f35807fe92eca9a141e5222d6cf-merged.mount: Deactivated successfully.
Oct 10 05:45:13 np0005479823 podman[72501]: 2025-10-10 09:45:13.595309725 +0000 UTC m=+16.619538453 container remove 063d54e38ca2a632e79dd75b4c61fe8deb94f9437b08bce0082b19a7a58a811f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nifty_knuth, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Oct 10 05:45:13 np0005479823 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 05:45:13 np0005479823 systemd[1]: libpod-conmon-063d54e38ca2a632e79dd75b4c61fe8deb94f9437b08bce0082b19a7a58a811f.scope: Deactivated successfully.
Oct 10 05:45:13 np0005479823 podman[72588]: 2025-10-10 09:45:13.76729893 +0000 UTC m=+0.037989025 container create 155197f3a59bb340fa9c527981931ba3191417d37e92232720558892903ead04 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_carson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325)
Oct 10 05:45:13 np0005479823 systemd[1]: Started libpod-conmon-155197f3a59bb340fa9c527981931ba3191417d37e92232720558892903ead04.scope.
Oct 10 05:45:13 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:45:13 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2209aabeb53b0d845c8439475b6048d147881ac5db3d429554d3f96c785643c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:13 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a2209aabeb53b0d845c8439475b6048d147881ac5db3d429554d3f96c785643c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:13 np0005479823 podman[72588]: 2025-10-10 09:45:13.844902959 +0000 UTC m=+0.115593084 container init 155197f3a59bb340fa9c527981931ba3191417d37e92232720558892903ead04 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_carson, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Oct 10 05:45:13 np0005479823 podman[72588]: 2025-10-10 09:45:13.750489773 +0000 UTC m=+0.021179878 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:45:13 np0005479823 podman[72588]: 2025-10-10 09:45:13.856148118 +0000 UTC m=+0.126838203 container start 155197f3a59bb340fa9c527981931ba3191417d37e92232720558892903ead04 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_carson, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2)
Oct 10 05:45:13 np0005479823 podman[72588]: 2025-10-10 09:45:13.860994703 +0000 UTC m=+0.131684838 container attach 155197f3a59bb340fa9c527981931ba3191417d37e92232720558892903ead04 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_carson, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 05:45:14 np0005479823 loving_carson[72605]: [
Oct 10 05:45:14 np0005479823 loving_carson[72605]:    {
Oct 10 05:45:14 np0005479823 loving_carson[72605]:        "available": false,
Oct 10 05:45:14 np0005479823 loving_carson[72605]:        "being_replaced": false,
Oct 10 05:45:14 np0005479823 loving_carson[72605]:        "ceph_device_lvm": false,
Oct 10 05:45:14 np0005479823 loving_carson[72605]:        "device_id": "QEMU_DVD-ROM_QM00001",
Oct 10 05:45:14 np0005479823 loving_carson[72605]:        "lsm_data": {},
Oct 10 05:45:14 np0005479823 loving_carson[72605]:        "lvs": [],
Oct 10 05:45:14 np0005479823 loving_carson[72605]:        "path": "/dev/sr0",
Oct 10 05:45:14 np0005479823 loving_carson[72605]:        "rejected_reasons": [
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            "Insufficient space (<5GB)",
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            "Has a FileSystem"
Oct 10 05:45:14 np0005479823 loving_carson[72605]:        ],
Oct 10 05:45:14 np0005479823 loving_carson[72605]:        "sys_api": {
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            "actuators": null,
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            "device_nodes": [
Oct 10 05:45:14 np0005479823 loving_carson[72605]:                "sr0"
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            ],
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            "devname": "sr0",
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            "human_readable_size": "482.00 KB",
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            "id_bus": "ata",
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            "model": "QEMU DVD-ROM",
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            "nr_requests": "2",
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            "parent": "/dev/sr0",
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            "partitions": {},
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            "path": "/dev/sr0",
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            "removable": "1",
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            "rev": "2.5+",
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            "ro": "0",
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            "rotational": "0",
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            "sas_address": "",
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            "sas_device_handle": "",
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            "scheduler_mode": "mq-deadline",
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            "sectors": 0,
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            "sectorsize": "2048",
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            "size": 493568.0,
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            "support_discard": "2048",
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            "type": "disk",
Oct 10 05:45:14 np0005479823 loving_carson[72605]:            "vendor": "QEMU"
Oct 10 05:45:14 np0005479823 loving_carson[72605]:        }
Oct 10 05:45:14 np0005479823 loving_carson[72605]:    }
Oct 10 05:45:14 np0005479823 loving_carson[72605]: ]
Oct 10 05:45:14 np0005479823 systemd[1]: libpod-155197f3a59bb340fa9c527981931ba3191417d37e92232720558892903ead04.scope: Deactivated successfully.
Oct 10 05:45:14 np0005479823 podman[73561]: 2025-10-10 09:45:14.608746322 +0000 UTC m=+0.022106257 container died 155197f3a59bb340fa9c527981931ba3191417d37e92232720558892903ead04 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_carson, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Oct 10 05:45:14 np0005479823 systemd[1]: var-lib-containers-storage-overlay-a2209aabeb53b0d845c8439475b6048d147881ac5db3d429554d3f96c785643c-merged.mount: Deactivated successfully.
Oct 10 05:45:14 np0005479823 podman[73561]: 2025-10-10 09:45:14.640974532 +0000 UTC m=+0.054334467 container remove 155197f3a59bb340fa9c527981931ba3191417d37e92232720558892903ead04 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_carson, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 05:45:14 np0005479823 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 05:45:14 np0005479823 systemd[1]: libpod-conmon-155197f3a59bb340fa9c527981931ba3191417d37e92232720558892903ead04.scope: Deactivated successfully.
Oct 10 05:45:17 np0005479823 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 05:45:17 np0005479823 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 05:45:17 np0005479823 podman[74564]: 2025-10-10 09:45:17.498274215 +0000 UTC m=+0.021855909 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:45:17 np0005479823 podman[74564]: 2025-10-10 09:45:17.630903493 +0000 UTC m=+0.154485187 container create df9117c0866243db3256fee42b8198fd4fa8a3cef7075e26f6d7048e731eabd7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_pare, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 05:45:17 np0005479823 systemd[1]: Started libpod-conmon-df9117c0866243db3256fee42b8198fd4fa8a3cef7075e26f6d7048e731eabd7.scope.
Oct 10 05:45:17 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:45:17 np0005479823 podman[74564]: 2025-10-10 09:45:17.889085181 +0000 UTC m=+0.412666885 container init df9117c0866243db3256fee42b8198fd4fa8a3cef7075e26f6d7048e731eabd7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_pare, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 05:45:17 np0005479823 podman[74564]: 2025-10-10 09:45:17.895624279 +0000 UTC m=+0.419205973 container start df9117c0866243db3256fee42b8198fd4fa8a3cef7075e26f6d7048e731eabd7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_pare, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Oct 10 05:45:17 np0005479823 podman[74564]: 2025-10-10 09:45:17.898992357 +0000 UTC m=+0.422574051 container attach df9117c0866243db3256fee42b8198fd4fa8a3cef7075e26f6d7048e731eabd7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_pare, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True)
Oct 10 05:45:17 np0005479823 zen_pare[74580]: 167 167
Oct 10 05:45:17 np0005479823 podman[74564]: 2025-10-10 09:45:17.901736775 +0000 UTC m=+0.425318459 container died df9117c0866243db3256fee42b8198fd4fa8a3cef7075e26f6d7048e731eabd7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_pare, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 05:45:17 np0005479823 systemd[1]: libpod-df9117c0866243db3256fee42b8198fd4fa8a3cef7075e26f6d7048e731eabd7.scope: Deactivated successfully.
Oct 10 05:45:17 np0005479823 podman[74564]: 2025-10-10 09:45:17.934405468 +0000 UTC m=+0.457987162 container remove df9117c0866243db3256fee42b8198fd4fa8a3cef7075e26f6d7048e731eabd7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_pare, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS)
Oct 10 05:45:17 np0005479823 systemd[1]: libpod-conmon-df9117c0866243db3256fee42b8198fd4fa8a3cef7075e26f6d7048e731eabd7.scope: Deactivated successfully.
Oct 10 05:45:17 np0005479823 podman[74598]: 2025-10-10 09:45:17.995947394 +0000 UTC m=+0.039071669 container create 05b994d81c1e03ab7e56f7cd80ac4aebb5ae42b118da74d15866932a538d2481 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_dubinsky, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325)
Oct 10 05:45:18 np0005479823 systemd[1]: Started libpod-conmon-05b994d81c1e03ab7e56f7cd80ac4aebb5ae42b118da74d15866932a538d2481.scope.
Oct 10 05:45:18 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:45:18 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/efbe33c476a5d83bf99a66d28abe443a4190b0a08613813527c84caf2097cfba/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:18 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/efbe33c476a5d83bf99a66d28abe443a4190b0a08613813527c84caf2097cfba/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:18 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/efbe33c476a5d83bf99a66d28abe443a4190b0a08613813527c84caf2097cfba/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:18 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/efbe33c476a5d83bf99a66d28abe443a4190b0a08613813527c84caf2097cfba/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:18 np0005479823 podman[74598]: 2025-10-10 09:45:18.055782756 +0000 UTC m=+0.098907061 container init 05b994d81c1e03ab7e56f7cd80ac4aebb5ae42b118da74d15866932a538d2481 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_dubinsky, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Oct 10 05:45:18 np0005479823 podman[74598]: 2025-10-10 09:45:18.061636493 +0000 UTC m=+0.104760768 container start 05b994d81c1e03ab7e56f7cd80ac4aebb5ae42b118da74d15866932a538d2481 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_dubinsky, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Oct 10 05:45:18 np0005479823 podman[74598]: 2025-10-10 09:45:18.06496161 +0000 UTC m=+0.108085905 container attach 05b994d81c1e03ab7e56f7cd80ac4aebb5ae42b118da74d15866932a538d2481 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_dubinsky, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 10 05:45:18 np0005479823 podman[74598]: 2025-10-10 09:45:17.978703134 +0000 UTC m=+0.021827439 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:45:18 np0005479823 systemd[1]: libpod-05b994d81c1e03ab7e56f7cd80ac4aebb5ae42b118da74d15866932a538d2481.scope: Deactivated successfully.
Oct 10 05:45:18 np0005479823 podman[74598]: 2025-10-10 09:45:18.120413171 +0000 UTC m=+0.163537456 container died 05b994d81c1e03ab7e56f7cd80ac4aebb5ae42b118da74d15866932a538d2481 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_dubinsky, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1)
Oct 10 05:45:18 np0005479823 podman[74598]: 2025-10-10 09:45:18.15294748 +0000 UTC m=+0.196071755 container remove 05b994d81c1e03ab7e56f7cd80ac4aebb5ae42b118da74d15866932a538d2481 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=peaceful_dubinsky, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 10 05:45:18 np0005479823 systemd[1]: libpod-conmon-05b994d81c1e03ab7e56f7cd80ac4aebb5ae42b118da74d15866932a538d2481.scope: Deactivated successfully.
Oct 10 05:45:18 np0005479823 systemd[1]: Reloading.
Oct 10 05:45:18 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:45:18 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:45:18 np0005479823 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 05:45:18 np0005479823 systemd[1]: Reloading.
Oct 10 05:45:18 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:45:18 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:45:18 np0005479823 systemd[1]: Reached target All Ceph clusters and services.
Oct 10 05:45:18 np0005479823 systemd[1]: Reloading.
Oct 10 05:45:18 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:45:18 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:45:18 np0005479823 systemd[1]: Reached target Ceph cluster 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 05:45:18 np0005479823 systemd[1]: Reloading.
Oct 10 05:45:19 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:45:19 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:45:19 np0005479823 systemd[1]: Reloading.
Oct 10 05:45:19 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:45:19 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:45:19 np0005479823 systemd[1]: Created slice Slice /system/ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 05:45:19 np0005479823 systemd[1]: Reached target System Time Set.
Oct 10 05:45:19 np0005479823 systemd[1]: Reached target System Time Synchronized.
Oct 10 05:45:19 np0005479823 systemd[1]: Starting Ceph mon.compute-2 for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 05:45:19 np0005479823 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 05:45:19 np0005479823 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 10 05:45:19 np0005479823 podman[74893]: 2025-10-10 09:45:19.704777167 +0000 UTC m=+0.040439083 container create bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 10 05:45:19 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c31ac3b2f06a3f3ca385e8b0f02e9fc3131446e2d14a4a8898eb05888cb131d2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:19 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c31ac3b2f06a3f3ca385e8b0f02e9fc3131446e2d14a4a8898eb05888cb131d2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:19 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c31ac3b2f06a3f3ca385e8b0f02e9fc3131446e2d14a4a8898eb05888cb131d2/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:19 np0005479823 podman[74893]: 2025-10-10 09:45:19.773468472 +0000 UTC m=+0.109130388 container init bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 05:45:19 np0005479823 podman[74893]: 2025-10-10 09:45:19.77871997 +0000 UTC m=+0.114381856 container start bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Oct 10 05:45:19 np0005479823 bash[74893]: bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd
Oct 10 05:45:19 np0005479823 podman[74893]: 2025-10-10 09:45:19.686395971 +0000 UTC m=+0.022057877 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:45:19 np0005479823 systemd[1]: Started Ceph mon.compute-2 for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: set uid:gid to 167:167 (ceph:ceph)
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mon, pid 2
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: pidfile_write: ignore empty --pid-file
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: load: jerasure load: lrc 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: RocksDB version: 7.9.2
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: Git sha 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: Compile date 2025-07-17 03:12:14
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: DB SUMMARY
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: DB Session ID:  2V808MJHDIXUCLJZ1TSV
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: CURRENT file:  CURRENT
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: IDENTITY file:  IDENTITY
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-2/store.db dir, Total Num: 0, files: 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-2/store.db: 000004.log size: 511 ; 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                         Options.error_if_exists: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                       Options.create_if_missing: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                         Options.paranoid_checks: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:             Options.flush_verify_memtable_count: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                                     Options.env: 0x561619484c20
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                                      Options.fs: PosixFileSystem
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                                Options.info_log: 0x56161a93fa20
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                Options.max_file_opening_threads: 16
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                              Options.statistics: (nil)
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                               Options.use_fsync: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                       Options.max_log_file_size: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                   Options.log_file_time_to_roll: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                       Options.keep_log_file_num: 1000
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                    Options.recycle_log_file_num: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                         Options.allow_fallocate: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                        Options.allow_mmap_reads: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                       Options.allow_mmap_writes: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                        Options.use_direct_reads: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:          Options.create_missing_column_families: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                              Options.db_log_dir: 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                                 Options.wal_dir: 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                Options.table_cache_numshardbits: 6
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                         Options.WAL_ttl_seconds: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                       Options.WAL_size_limit_MB: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:             Options.manifest_preallocation_size: 4194304
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                     Options.is_fd_close_on_exec: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                   Options.advise_random_on_open: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                    Options.db_write_buffer_size: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                    Options.write_buffer_manager: 0x56161a943900
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:         Options.access_hint_on_compaction_start: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                      Options.use_adaptive_mutex: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                            Options.rate_limiter: (nil)
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                       Options.wal_recovery_mode: 2
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                  Options.enable_thread_tracking: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                  Options.enable_pipelined_write: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                  Options.unordered_write: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:             Options.write_thread_max_yield_usec: 100
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                               Options.row_cache: None
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                              Options.wal_filter: None
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:             Options.avoid_flush_during_recovery: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:             Options.allow_ingest_behind: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:             Options.two_write_queues: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:             Options.manual_wal_flush: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:             Options.wal_compression: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:             Options.atomic_flush: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                 Options.persist_stats_to_disk: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                 Options.write_dbid_to_manifest: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                 Options.log_readahead_size: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                 Options.best_efforts_recovery: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:             Options.allow_data_in_errors: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:             Options.db_host_id: __hostname__
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:             Options.enforce_single_del_contracts: true
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:             Options.max_background_jobs: 2
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:             Options.max_background_compactions: -1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:             Options.max_subcompactions: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:             Options.delayed_write_rate : 16777216
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:             Options.max_total_wal_size: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                   Options.stats_dump_period_sec: 600
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                 Options.stats_persist_period_sec: 600
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                          Options.max_open_files: -1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                          Options.bytes_per_sync: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                      Options.wal_bytes_per_sync: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                   Options.strict_bytes_per_sync: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:       Options.compaction_readahead_size: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                  Options.max_background_flushes: -1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: Compression algorithms supported:
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: 	kZSTD supported: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: 	kXpressCompression supported: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: 	kBZip2Compression supported: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: 	kLZ4Compression supported: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: 	kZlibCompression supported: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: 	kLZ4HCCompression supported: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: 	kSnappyCompression supported: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: Fast CRC32 supported: Supported on x86
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: DMutex implementation: pthread_mutex_t
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:           Options.merge_operator: 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:        Options.compaction_filter: None
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x56161a93e5c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x56161a963350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:        Options.write_buffer_size: 33554432
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:  Options.max_write_buffer_number: 2
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:          Options.compression: NoCompression
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:             Options.num_levels: 7
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                           Options.bloom_locality: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                               Options.ttl: 2592000
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                       Options.enable_blob_files: false
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                           Options.min_blob_size: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: c3989026-94dc-41dd-a555-ef3b3fd6f1b8
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089519820949, "job": 1, "event": "recovery_started", "wal_files": [4]}
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089519822758, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089519822895, "job": 1, "event": "recovery_finished"}
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x56161a964e00
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: DB pointer 0x56161aa6e000
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2 does not exist in monmap, will attempt to join an existing cluster
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.14 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.14 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x56161a963350#2 capacity: 512.00 MB usage: 0.86 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 2.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1,0.64 KB,0.00012219%) FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: using public_addr v2:192.168.122.102:0/0 -> [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: starting mon.compute-2 rank -1 at public addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] at bind addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-2 fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2@-1(???) e0 preinit fsid 21f084a3-af34-5230-afe4-ea5cd24a55f4
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2@-1(synchronizing).mds e1 new map
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2@-1(synchronizing).mds e1 print_map#012e1#012btime 2025-10-10T09:43:15.731413+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: -1#012 #012No filesystems configured
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e8 e8: 2 total, 0 up, 2 in
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e9 e9: 2 total, 0 up, 2 in
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e10 e10: 2 total, 1 up, 2 in
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e11 e11: 2 total, 1 up, 2 in
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e12 e12: 2 total, 1 up, 2 in
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e13 e13: 2 total, 2 up, 2 in
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e14 e14: 2 total, 2 up, 2 in
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e15 e15: 2 total, 2 up, 2 in
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e15 crush map has features 3314933000852226048, adjusting msgr requires
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e15 crush map has features 288514051259236352, adjusting msgr requires
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e15 crush map has features 288514051259236352, adjusting msgr requires
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2@-1(synchronizing).osd e15 crush map has features 288514051259236352, adjusting msgr requires
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: Failed to apply mon spec MONSpec.from_json(yaml.safe_load('''service_type: mon#012service_name: mon#012placement:#012  hosts:#012  - compute-0#012  - compute-1#012  - compute-2#012''')): Cannot place <MONSpec for service_name=mon> on compute-2: Unknown hosts
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: Failed to apply mgr spec ServiceSpec.from_json(yaml.safe_load('''service_type: mgr#012service_name: mgr#012placement:#012  hosts:#012  - compute-0#012  - compute-1#012  - compute-2#012''')): Cannot place <ServiceSpec for service_name=mgr> on compute-2: Unknown hosts
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: Deploying daemon crash.compute-1 on compute-1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: Health check failed: Failed to apply 2 service(s): mon,mgr (CEPHADM_APPLY_SPEC_FAIL)
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/4172963951' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "c307f4a4-39e7-4a9c-9d19-a2b8712089ab"}]: dispatch
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/4172963951' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "c307f4a4-39e7-4a9c-9d19-a2b8712089ab"}]': finished
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.101:0/234960172' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "aea3dcf0-efc7-4ff7-81f8-9509a806fb04"}]: dispatch
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.101:0/234960172' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "aea3dcf0-efc7-4ff7-81f8-9509a806fb04"}]': finished
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: Health check cleared: TOO_FEW_OSDS (was: OSD count 0 < osd_pool_default_size 1)
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: Deploying daemon osd.0 on compute-0
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: Deploying daemon osd.1 on compute-1
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='osd.0 [v2:192.168.122.100:6802/2298200206,v1:192.168.122.100:6803/2298200206]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='osd.0 [v2:192.168.122.100:6802/2298200206,v1:192.168.122.100:6803/2298200206]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='osd.0 [v2:192.168.122.100:6802/2298200206,v1:192.168.122.100:6803/2298200206]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]: dispatch
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='osd.0 [v2:192.168.122.100:6802/2298200206,v1:192.168.122.100:6803/2298200206]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=compute-0", "root=default"]}]': finished
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='osd.1 [v2:192.168.122.101:6800/2840395396,v1:192.168.122.101:6801/2840395396]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='osd.1 [v2:192.168.122.101:6800/2840395396,v1:192.168.122.101:6801/2840395396]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='osd.1 [v2:192.168.122.101:6800/2840395396,v1:192.168.122.101:6801/2840395396]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-1", "root=default"]}]: dispatch
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='osd.1 [v2:192.168.122.101:6800/2840395396,v1:192.168.122.101:6801/2840395396]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=compute-1", "root=default"]}]': finished
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: Adjusting osd_memory_target on compute-1 to  5248M
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: OSD bench result of 8693.274022 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: osd.0 [v2:192.168.122.100:6802/2298200206,v1:192.168.122.100:6803/2298200206] boot
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: Adjusting osd_memory_target on compute-0 to 128.0M
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: Unable to set osd_memory_target on compute-0 to 134240665: error parsing value: Value '134240665' is below minimum 939524096
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: OSD bench result of 2508.856277 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.1. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: osd.1 [v2:192.168.122.101:6800/2840395396,v1:192.168.122.101:6801/2840395396] boot
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: Updating compute-2:/etc/ceph/ceph.conf
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: Updating compute-2:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: Updating compute-2:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: Deploying daemon mon.compute-2 on compute-2
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: Cluster is now healthy
Oct 10 05:45:19 np0005479823 ceph-mon[74913]: mon.compute-2@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Oct 10 05:45:21 np0005479823 ceph-mon[74913]: mon.compute-2@-1(probing) e2  my rank is now 1 (was -1)
Oct 10 05:45:21 np0005479823 ceph-mon[74913]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Oct 10 05:45:21 np0005479823 ceph-mon[74913]: paxos.1).electionLogic(0) init, first boot, initializing epoch at 1 
Oct 10 05:45:21 np0005479823 ceph-mon[74913]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct 10 05:45:22 np0005479823 ceph-mon[74913]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Oct 10 05:45:22 np0005479823 ceph-mon[74913]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Oct 10 05:45:24 np0005479823 ceph-mon[74913]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Oct 10 05:45:24 np0005479823 ceph-mon[74913]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct 10 05:45:24 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Oct 10 05:45:24 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout,16=squid ondisk layout}
Oct 10 05:45:24 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:24 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:24 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:24 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Oct 10 05:45:24 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct 10 05:45:24 np0005479823 ceph-mon[74913]: mgrc update_daemon_metadata mon.compute-2 metadata {addrs=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0],arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-2,kernel_description=#1 SMP PREEMPT_DYNAMIC Tue Sep 30 07:37:35 UTC 2025,kernel_version=5.14.0-621.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864356,os=Linux}
Oct 10 05:45:25 np0005479823 ceph-mon[74913]: Deploying daemon mon.compute-1 on compute-1
Oct 10 05:45:25 np0005479823 ceph-mon[74913]: mon.compute-0 calling monitor election
Oct 10 05:45:25 np0005479823 ceph-mon[74913]: mon.compute-2 calling monitor election
Oct 10 05:45:25 np0005479823 ceph-mon[74913]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Oct 10 05:45:25 np0005479823 ceph-mon[74913]: overall HEALTH_OK
Oct 10 05:45:25 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:25 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:25 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:25 np0005479823 podman[75042]: 2025-10-10 09:45:25.549977327 +0000 UTC m=+0.037306033 container create ba940501a2f5c70795046ef6531e27621a172f7ff10a9397e08aa6c2d13cf89f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=compassionate_galois, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 05:45:25 np0005479823 systemd[1]: Started libpod-conmon-ba940501a2f5c70795046ef6531e27621a172f7ff10a9397e08aa6c2d13cf89f.scope.
Oct 10 05:45:25 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:45:25 np0005479823 podman[75042]: 2025-10-10 09:45:25.532706575 +0000 UTC m=+0.020035281 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:45:25 np0005479823 podman[75042]: 2025-10-10 09:45:25.633095332 +0000 UTC m=+0.120424048 container init ba940501a2f5c70795046ef6531e27621a172f7ff10a9397e08aa6c2d13cf89f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=compassionate_galois, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 05:45:25 np0005479823 podman[75042]: 2025-10-10 09:45:25.639993603 +0000 UTC m=+0.127322299 container start ba940501a2f5c70795046ef6531e27621a172f7ff10a9397e08aa6c2d13cf89f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=compassionate_galois, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 05:45:25 np0005479823 podman[75042]: 2025-10-10 09:45:25.644195066 +0000 UTC m=+0.131523762 container attach ba940501a2f5c70795046ef6531e27621a172f7ff10a9397e08aa6c2d13cf89f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=compassionate_galois, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 05:45:25 np0005479823 compassionate_galois[75058]: 167 167
Oct 10 05:45:25 np0005479823 systemd[1]: libpod-ba940501a2f5c70795046ef6531e27621a172f7ff10a9397e08aa6c2d13cf89f.scope: Deactivated successfully.
Oct 10 05:45:25 np0005479823 podman[75042]: 2025-10-10 09:45:25.647301886 +0000 UTC m=+0.134630582 container died ba940501a2f5c70795046ef6531e27621a172f7ff10a9397e08aa6c2d13cf89f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=compassionate_galois, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid)
Oct 10 05:45:25 np0005479823 systemd[1]: var-lib-containers-storage-overlay-b1ce5ac7e351281de10c42fcb7fdce4ab56a5fdfe2beeb7c1660adf9d0967d1b-merged.mount: Deactivated successfully.
Oct 10 05:45:25 np0005479823 podman[75042]: 2025-10-10 09:45:25.690953731 +0000 UTC m=+0.178282427 container remove ba940501a2f5c70795046ef6531e27621a172f7ff10a9397e08aa6c2d13cf89f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=compassionate_galois, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.40.1)
Oct 10 05:45:25 np0005479823 systemd[1]: libpod-conmon-ba940501a2f5c70795046ef6531e27621a172f7ff10a9397e08aa6c2d13cf89f.scope: Deactivated successfully.
Oct 10 05:45:25 np0005479823 systemd[1]: Reloading.
Oct 10 05:45:25 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:45:25 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:45:25 np0005479823 systemd[1]: Reloading.
Oct 10 05:45:26 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:26 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.gkrssp", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Oct 10 05:45:26 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.gkrssp", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Oct 10 05:45:26 np0005479823 ceph-mon[74913]: Deploying daemon mgr.compute-2.gkrssp on compute-2
Oct 10 05:45:26 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:45:26 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:45:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Oct 10 05:45:26 np0005479823 ceph-mon[74913]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Oct 10 05:45:26 np0005479823 ceph-mon[74913]: paxos.1).electionLogic(10) init, last seen epoch 10
Oct 10 05:45:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct 10 05:45:26 np0005479823 systemd[1]: Starting Ceph mgr.compute-2.gkrssp for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 05:45:26 np0005479823 podman[75199]: 2025-10-10 09:45:26.462093766 +0000 UTC m=+0.045947218 container create 04def5c470185e333ff2788fce44cd382250a90c2fb8289f5f3139b45bba29d4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid)
Oct 10 05:45:26 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9c4da16a382606adfde96f4085a30764f5f7ae97af0986785fc6e0b8502eedd/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:26 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9c4da16a382606adfde96f4085a30764f5f7ae97af0986785fc6e0b8502eedd/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:26 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9c4da16a382606adfde96f4085a30764f5f7ae97af0986785fc6e0b8502eedd/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:26 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9c4da16a382606adfde96f4085a30764f5f7ae97af0986785fc6e0b8502eedd/merged/var/lib/ceph/mgr/ceph-compute-2.gkrssp supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:26 np0005479823 podman[75199]: 2025-10-10 09:45:26.5317034 +0000 UTC m=+0.115556862 container init 04def5c470185e333ff2788fce44cd382250a90c2fb8289f5f3139b45bba29d4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 05:45:26 np0005479823 podman[75199]: 2025-10-10 09:45:26.441567641 +0000 UTC m=+0.025421113 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:45:26 np0005479823 podman[75199]: 2025-10-10 09:45:26.53949477 +0000 UTC m=+0.123348222 container start 04def5c470185e333ff2788fce44cd382250a90c2fb8289f5f3139b45bba29d4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Oct 10 05:45:26 np0005479823 bash[75199]: 04def5c470185e333ff2788fce44cd382250a90c2fb8289f5f3139b45bba29d4
Oct 10 05:45:26 np0005479823 systemd[1]: Started Ceph mgr.compute-2.gkrssp for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 05:45:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Oct 10 05:45:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Oct 10 05:45:27 np0005479823 ceph-mon[74913]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Oct 10 05:45:27 np0005479823 ceph-mon[74913]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Oct 10 05:45:29 np0005479823 ceph-mon[74913]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Oct 10 05:45:29 np0005479823 ceph-mon[74913]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Oct 10 05:45:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Oct 10 05:45:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Oct 10 05:45:31 np0005479823 ceph-mon[74913]: paxos.1).electionLogic(11) init, last seen epoch 11, mid-election, bumping
Oct 10 05:45:31 np0005479823 ceph-mon[74913]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct 10 05:45:31 np0005479823 ceph-mon[74913]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct 10 05:45:31 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Oct 10 05:45:31 np0005479823 ceph-mon[74913]: mon.compute-0 calling monitor election
Oct 10 05:45:31 np0005479823 ceph-mon[74913]: mon.compute-2 calling monitor election
Oct 10 05:45:31 np0005479823 ceph-mon[74913]: mon.compute-1 calling monitor election
Oct 10 05:45:31 np0005479823 ceph-mon[74913]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Oct 10 05:45:31 np0005479823 ceph-mon[74913]: overall HEALTH_OK
Oct 10 05:45:31 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:31 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:31 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:31 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:31 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.rfugxc", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Oct 10 05:45:32 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.rfugxc", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Oct 10 05:45:32 np0005479823 ceph-mon[74913]: Deploying daemon mgr.compute-1.rfugxc on compute-1
Oct 10 05:45:32 np0005479823 ceph-mgr[75218]: set uid:gid to 167:167 (ceph:ceph)
Oct 10 05:45:32 np0005479823 ceph-mgr[75218]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Oct 10 05:45:32 np0005479823 ceph-mgr[75218]: pidfile_write: ignore empty --pid-file
Oct 10 05:45:32 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'alerts'
Oct 10 05:45:32 np0005479823 ceph-mgr[75218]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Oct 10 05:45:32 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'balancer'
Oct 10 05:45:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:32.755+0000 7f4b1c081140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Oct 10 05:45:32 np0005479823 ceph-mgr[75218]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Oct 10 05:45:32 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'cephadm'
Oct 10 05:45:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:32.845+0000 7f4b1c081140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Oct 10 05:45:33 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/3667835426' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Oct 10 05:45:33 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:33 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:33 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:33 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:33 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Oct 10 05:45:33 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Oct 10 05:45:33 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e16 e16: 2 total, 2 up, 2 in
Oct 10 05:45:33 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'crash'
Oct 10 05:45:33 np0005479823 ceph-mgr[75218]: mgr[py] Module crash has missing NOTIFY_TYPES member
Oct 10 05:45:33 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'dashboard'
Oct 10 05:45:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:33.671+0000 7f4b1c081140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Oct 10 05:45:33 np0005479823 podman[75342]: 2025-10-10 09:45:33.835255371 +0000 UTC m=+0.045040780 container create b4781a319cb674c24f0888d5e2cb80795be31fc69ca1e3c9438c2217b16a259b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_dewdney, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Oct 10 05:45:33 np0005479823 systemd[1]: Started libpod-conmon-b4781a319cb674c24f0888d5e2cb80795be31fc69ca1e3c9438c2217b16a259b.scope.
Oct 10 05:45:33 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:45:33 np0005479823 podman[75342]: 2025-10-10 09:45:33.815626434 +0000 UTC m=+0.025411863 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:45:33 np0005479823 podman[75342]: 2025-10-10 09:45:33.92071138 +0000 UTC m=+0.130496779 container init b4781a319cb674c24f0888d5e2cb80795be31fc69ca1e3c9438c2217b16a259b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_dewdney, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 05:45:33 np0005479823 podman[75342]: 2025-10-10 09:45:33.92916201 +0000 UTC m=+0.138947389 container start b4781a319cb674c24f0888d5e2cb80795be31fc69ca1e3c9438c2217b16a259b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_dewdney, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Oct 10 05:45:33 np0005479823 podman[75342]: 2025-10-10 09:45:33.932955752 +0000 UTC m=+0.142741141 container attach b4781a319cb674c24f0888d5e2cb80795be31fc69ca1e3c9438c2217b16a259b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_dewdney, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Oct 10 05:45:33 np0005479823 elated_dewdney[75360]: 167 167
Oct 10 05:45:33 np0005479823 systemd[1]: libpod-b4781a319cb674c24f0888d5e2cb80795be31fc69ca1e3c9438c2217b16a259b.scope: Deactivated successfully.
Oct 10 05:45:33 np0005479823 podman[75342]: 2025-10-10 09:45:33.938046474 +0000 UTC m=+0.147831853 container died b4781a319cb674c24f0888d5e2cb80795be31fc69ca1e3c9438c2217b16a259b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_dewdney, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 05:45:33 np0005479823 systemd[1]: var-lib-containers-storage-overlay-f5d969ae85a423e47d1305a6d8579b77154899f5c8c0e93b7c5fc1ca28b824b1-merged.mount: Deactivated successfully.
Oct 10 05:45:33 np0005479823 podman[75342]: 2025-10-10 09:45:33.988904889 +0000 UTC m=+0.198690268 container remove b4781a319cb674c24f0888d5e2cb80795be31fc69ca1e3c9438c2217b16a259b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_dewdney, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1)
Oct 10 05:45:33 np0005479823 systemd[1]: libpod-conmon-b4781a319cb674c24f0888d5e2cb80795be31fc69ca1e3c9438c2217b16a259b.scope: Deactivated successfully.
Oct 10 05:45:34 np0005479823 systemd[1]: Reloading.
Oct 10 05:45:34 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:45:34 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:45:34 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'devicehealth'
Oct 10 05:45:34 np0005479823 systemd[1]: Reloading.
Oct 10 05:45:34 np0005479823 ceph-mon[74913]: Deploying daemon crash.compute-2 on compute-2
Oct 10 05:45:34 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/3667835426' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Oct 10 05:45:34 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/3269086226' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Oct 10 05:45:34 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e17 e17: 2 total, 2 up, 2 in
Oct 10 05:45:34 np0005479823 ceph-mgr[75218]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Oct 10 05:45:34 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'diskprediction_local'
Oct 10 05:45:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:34.375+0000 7f4b1c081140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Oct 10 05:45:34 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:45:34 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:45:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Oct 10 05:45:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Oct 10 05:45:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]:  from numpy import show_config as show_numpy_config
Oct 10 05:45:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:34.550+0000 7f4b1c081140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Oct 10 05:45:34 np0005479823 ceph-mgr[75218]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Oct 10 05:45:34 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'influx'
Oct 10 05:45:34 np0005479823 systemd[1]: Starting Ceph crash.compute-2 for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 05:45:34 np0005479823 ceph-mgr[75218]: mgr[py] Module influx has missing NOTIFY_TYPES member
Oct 10 05:45:34 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'insights'
Oct 10 05:45:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:34.623+0000 7f4b1c081140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Oct 10 05:45:34 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'iostat'
Oct 10 05:45:34 np0005479823 ceph-mgr[75218]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Oct 10 05:45:34 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'k8sevents'
Oct 10 05:45:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:34.760+0000 7f4b1c081140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Oct 10 05:45:34 np0005479823 podman[75504]: 2025-10-10 09:45:34.836410525 +0000 UTC m=+0.047845000 container create e6626ca9d8bcc16a7f77c3eb4e12186e85303a5987606787a8b2590756016ba3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, ceph=True)
Oct 10 05:45:34 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e17 _set_new_cache_sizes cache_size:1019932584 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:45:34 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d7303a7ba1602d1ff1915d9a031f0e5c69836bebaa5a6751cedc86b4b83fa65/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:34 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d7303a7ba1602d1ff1915d9a031f0e5c69836bebaa5a6751cedc86b4b83fa65/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:34 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d7303a7ba1602d1ff1915d9a031f0e5c69836bebaa5a6751cedc86b4b83fa65/merged/etc/ceph/ceph.client.crash.compute-2.keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:34 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d7303a7ba1602d1ff1915d9a031f0e5c69836bebaa5a6751cedc86b4b83fa65/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:34 np0005479823 podman[75504]: 2025-10-10 09:45:34.815292411 +0000 UTC m=+0.026726886 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:45:34 np0005479823 podman[75504]: 2025-10-10 09:45:34.924728057 +0000 UTC m=+0.136162592 container init e6626ca9d8bcc16a7f77c3eb4e12186e85303a5987606787a8b2590756016ba3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1)
Oct 10 05:45:34 np0005479823 podman[75504]: 2025-10-10 09:45:34.931346728 +0000 UTC m=+0.142781203 container start e6626ca9d8bcc16a7f77c3eb4e12186e85303a5987606787a8b2590756016ba3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 05:45:34 np0005479823 bash[75504]: e6626ca9d8bcc16a7f77c3eb4e12186e85303a5987606787a8b2590756016ba3
Oct 10 05:45:34 np0005479823 systemd[1]: Started Ceph crash.compute-2 for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 05:45:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: INFO:ceph-crash:pinging cluster to exercise our key
Oct 10 05:45:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: 2025-10-10T09:45:35.108+0000 7fbc726b8640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Oct 10 05:45:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: 2025-10-10T09:45:35.108+0000 7fbc726b8640 -1 AuthRegistry(0x7fbc6c0696b0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Oct 10 05:45:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: 2025-10-10T09:45:35.109+0000 7fbc726b8640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Oct 10 05:45:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: 2025-10-10T09:45:35.109+0000 7fbc726b8640 -1 AuthRegistry(0x7fbc726b6ff0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Oct 10 05:45:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: 2025-10-10T09:45:35.110+0000 7fbc70c2e640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Oct 10 05:45:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: 2025-10-10T09:45:35.111+0000 7fbc6b7fe640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Oct 10 05:45:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: 2025-10-10T09:45:35.112+0000 7fbc6bfff640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Oct 10 05:45:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: 2025-10-10T09:45:35.112+0000 7fbc726b8640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Oct 10 05:45:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: [errno 13] RADOS permission denied (error connecting to the cluster)
Oct 10 05:45:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Oct 10 05:45:35 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'localpool'
Oct 10 05:45:35 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'mds_autoscaler'
Oct 10 05:45:35 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/3269086226' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Oct 10 05:45:35 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:35 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:35 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:35 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:35 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 05:45:35 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 05:45:35 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/1727378227' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Oct 10 05:45:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e18 e18: 2 total, 2 up, 2 in
Oct 10 05:45:35 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'mirroring'
Oct 10 05:45:35 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'nfs'
Oct 10 05:45:35 np0005479823 podman[75628]: 2025-10-10 09:45:35.650803222 +0000 UTC m=+0.043562882 container create 224b3a3ba66bacd4b97b49724a85875855f8cc0ace019a05ed5f75b6550cc47e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_jones, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default)
Oct 10 05:45:35 np0005479823 systemd[1]: Started libpod-conmon-224b3a3ba66bacd4b97b49724a85875855f8cc0ace019a05ed5f75b6550cc47e.scope.
Oct 10 05:45:35 np0005479823 podman[75628]: 2025-10-10 09:45:35.632188918 +0000 UTC m=+0.024948618 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:45:35 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:45:35 np0005479823 podman[75628]: 2025-10-10 09:45:35.748582036 +0000 UTC m=+0.141341736 container init 224b3a3ba66bacd4b97b49724a85875855f8cc0ace019a05ed5f75b6550cc47e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_jones, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 05:45:35 np0005479823 podman[75628]: 2025-10-10 09:45:35.756345624 +0000 UTC m=+0.149105314 container start 224b3a3ba66bacd4b97b49724a85875855f8cc0ace019a05ed5f75b6550cc47e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_jones, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Oct 10 05:45:35 np0005479823 podman[75628]: 2025-10-10 09:45:35.761272312 +0000 UTC m=+0.154031982 container attach 224b3a3ba66bacd4b97b49724a85875855f8cc0ace019a05ed5f75b6550cc47e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_jones, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 05:45:35 np0005479823 infallible_jones[75645]: 167 167
Oct 10 05:45:35 np0005479823 systemd[1]: libpod-224b3a3ba66bacd4b97b49724a85875855f8cc0ace019a05ed5f75b6550cc47e.scope: Deactivated successfully.
Oct 10 05:45:35 np0005479823 podman[75628]: 2025-10-10 09:45:35.763174253 +0000 UTC m=+0.155933913 container died 224b3a3ba66bacd4b97b49724a85875855f8cc0ace019a05ed5f75b6550cc47e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_jones, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 05:45:35 np0005479823 systemd[1]: var-lib-containers-storage-overlay-fafe8ae54286663e98e3ac7b70335eae2f8c77dd16a7f5309b2faa66a02c505a-merged.mount: Deactivated successfully.
Oct 10 05:45:35 np0005479823 ceph-mgr[75218]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Oct 10 05:45:35 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'orchestrator'
Oct 10 05:45:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:35.795+0000 7f4b1c081140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Oct 10 05:45:35 np0005479823 podman[75628]: 2025-10-10 09:45:35.798231342 +0000 UTC m=+0.190991002 container remove 224b3a3ba66bacd4b97b49724a85875855f8cc0ace019a05ed5f75b6550cc47e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=infallible_jones, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Oct 10 05:45:35 np0005479823 systemd[1]: libpod-conmon-224b3a3ba66bacd4b97b49724a85875855f8cc0ace019a05ed5f75b6550cc47e.scope: Deactivated successfully.
Oct 10 05:45:35 np0005479823 podman[75669]: 2025-10-10 09:45:35.969294307 +0000 UTC m=+0.045926557 container create df7a01f707074bf942355f8ecb790223ec20609911b9454888ee19157bfca9ac (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vibrant_khorana, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, ceph=True, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 05:45:36 np0005479823 systemd[1]: Started libpod-conmon-df7a01f707074bf942355f8ecb790223ec20609911b9454888ee19157bfca9ac.scope.
Oct 10 05:45:36 np0005479823 ceph-mgr[75218]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Oct 10 05:45:36 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'osd_perf_query'
Oct 10 05:45:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:36.005+0000 7f4b1c081140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Oct 10 05:45:36 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:45:36 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb76ecaf72fb0897d828199edc1e984035c5564013a83481704aa19ca322213b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:36 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb76ecaf72fb0897d828199edc1e984035c5564013a83481704aa19ca322213b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:36 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb76ecaf72fb0897d828199edc1e984035c5564013a83481704aa19ca322213b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:36 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb76ecaf72fb0897d828199edc1e984035c5564013a83481704aa19ca322213b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:36 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb76ecaf72fb0897d828199edc1e984035c5564013a83481704aa19ca322213b/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:36 np0005479823 podman[75669]: 2025-10-10 09:45:35.952906274 +0000 UTC m=+0.029538544 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:45:36 np0005479823 podman[75669]: 2025-10-10 09:45:36.058202208 +0000 UTC m=+0.134834478 container init df7a01f707074bf942355f8ecb790223ec20609911b9454888ee19157bfca9ac (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vibrant_khorana, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 05:45:36 np0005479823 podman[75669]: 2025-10-10 09:45:36.065382417 +0000 UTC m=+0.142014667 container start df7a01f707074bf942355f8ecb790223ec20609911b9454888ee19157bfca9ac (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vibrant_khorana, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 05:45:36 np0005479823 podman[75669]: 2025-10-10 09:45:36.069990064 +0000 UTC m=+0.146622314 container attach df7a01f707074bf942355f8ecb790223ec20609911b9454888ee19157bfca9ac (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vibrant_khorana, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 05:45:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:36.089+0000 7f4b1c081140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Oct 10 05:45:36 np0005479823 ceph-mgr[75218]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Oct 10 05:45:36 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'osd_support'
Oct 10 05:45:36 np0005479823 ceph-mgr[75218]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Oct 10 05:45:36 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'pg_autoscaler'
Oct 10 05:45:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:36.166+0000 7f4b1c081140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Oct 10 05:45:36 np0005479823 ceph-mgr[75218]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Oct 10 05:45:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:36.252+0000 7f4b1c081140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Oct 10 05:45:36 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'progress'
Oct 10 05:45:36 np0005479823 ceph-mgr[75218]: mgr[py] Module progress has missing NOTIFY_TYPES member
Oct 10 05:45:36 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'prometheus'
Oct 10 05:45:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:36.321+0000 7f4b1c081140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Oct 10 05:45:36 np0005479823 ceph-mon[74913]: Health check failed: 2 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Oct 10 05:45:36 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/1727378227' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Oct 10 05:45:36 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:36 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/1828731644' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Oct 10 05:45:36 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e19 e19: 2 total, 2 up, 2 in
Oct 10 05:45:36 np0005479823 vibrant_khorana[75686]: --> passed data devices: 0 physical, 1 LVM
Oct 10 05:45:36 np0005479823 vibrant_khorana[75686]: Running command: /usr/bin/ceph-authtool --gen-print-key
Oct 10 05:45:36 np0005479823 vibrant_khorana[75686]: Running command: /usr/bin/ceph-authtool --gen-print-key
Oct 10 05:45:36 np0005479823 vibrant_khorana[75686]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new fd47bcfa-dab9-466a-b4bb-0169e493040a
Oct 10 05:45:36 np0005479823 ceph-mgr[75218]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Oct 10 05:45:36 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'rbd_support'
Oct 10 05:45:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:36.696+0000 7f4b1c081140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Oct 10 05:45:36 np0005479823 ceph-mgr[75218]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Oct 10 05:45:36 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'restful'
Oct 10 05:45:36 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd new", "uuid": "fd47bcfa-dab9-466a-b4bb-0169e493040a"} v 0)
Oct 10 05:45:36 np0005479823 ceph-mon[74913]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/3277074974' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "fd47bcfa-dab9-466a-b4bb-0169e493040a"}]: dispatch
Oct 10 05:45:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:36.817+0000 7f4b1c081140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Oct 10 05:45:36 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e20 e20: 3 total, 2 up, 3 in
Oct 10 05:45:36 np0005479823 vibrant_khorana[75686]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2
Oct 10 05:45:36 np0005479823 vibrant_khorana[75686]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Oct 10 05:45:36 np0005479823 lvm[75747]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct 10 05:45:36 np0005479823 lvm[75747]: VG ceph_vg0 finished
Oct 10 05:45:36 np0005479823 vibrant_khorana[75686]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Oct 10 05:45:36 np0005479823 vibrant_khorana[75686]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Oct 10 05:45:37 np0005479823 vibrant_khorana[75686]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap
Oct 10 05:45:37 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'rgw'
Oct 10 05:45:37 np0005479823 ceph-mgr[75218]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Oct 10 05:45:37 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'rook'
Oct 10 05:45:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:37.262+0000 7f4b1c081140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Oct 10 05:45:37 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/1828731644' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Oct 10 05:45:37 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.102:0/3277074974' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "fd47bcfa-dab9-466a-b4bb-0169e493040a"}]: dispatch
Oct 10 05:45:37 np0005479823 ceph-mon[74913]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "fd47bcfa-dab9-466a-b4bb-0169e493040a"}]: dispatch
Oct 10 05:45:37 np0005479823 ceph-mon[74913]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "fd47bcfa-dab9-466a-b4bb-0169e493040a"}]': finished
Oct 10 05:45:37 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/3839621145' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Oct 10 05:45:37 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon getmap"} v 0)
Oct 10 05:45:37 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1014583551' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Oct 10 05:45:37 np0005479823 vibrant_khorana[75686]: stderr: got monmap epoch 3
Oct 10 05:45:37 np0005479823 vibrant_khorana[75686]: --> Creating keyring file for osd.2
Oct 10 05:45:37 np0005479823 vibrant_khorana[75686]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring
Oct 10 05:45:37 np0005479823 vibrant_khorana[75686]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/
Oct 10 05:45:37 np0005479823 vibrant_khorana[75686]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid fd47bcfa-dab9-466a-b4bb-0169e493040a --setuser ceph --setgroup ceph
Oct 10 05:45:37 np0005479823 ceph-mgr[75218]: mgr[py] Module rook has missing NOTIFY_TYPES member
Oct 10 05:45:37 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'selftest'
Oct 10 05:45:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:37.835+0000 7f4b1c081140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Oct 10 05:45:37 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e21 e21: 3 total, 2 up, 3 in
Oct 10 05:45:37 np0005479823 ceph-mgr[75218]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Oct 10 05:45:37 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'snap_schedule'
Oct 10 05:45:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:37.907+0000 7f4b1c081140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Oct 10 05:45:37 np0005479823 ceph-mgr[75218]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Oct 10 05:45:37 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'stats'
Oct 10 05:45:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:37.988+0000 7f4b1c081140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Oct 10 05:45:38 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'status'
Oct 10 05:45:38 np0005479823 ceph-mgr[75218]: mgr[py] Module status has missing NOTIFY_TYPES member
Oct 10 05:45:38 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'telegraf'
Oct 10 05:45:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:38.139+0000 7f4b1c081140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Oct 10 05:45:38 np0005479823 ceph-mgr[75218]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Oct 10 05:45:38 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'telemetry'
Oct 10 05:45:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:38.211+0000 7f4b1c081140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Oct 10 05:45:38 np0005479823 ceph-mgr[75218]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Oct 10 05:45:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:38.358+0000 7f4b1c081140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Oct 10 05:45:38 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'test_orchestrator'
Oct 10 05:45:38 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/3839621145' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Oct 10 05:45:38 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Oct 10 05:45:38 np0005479823 ceph-mgr[75218]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Oct 10 05:45:38 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'volumes'
Oct 10 05:45:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:38.573+0000 7f4b1c081140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Oct 10 05:45:38 np0005479823 ceph-mgr[75218]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Oct 10 05:45:38 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'zabbix'
Oct 10 05:45:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:38.843+0000 7f4b1c081140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Oct 10 05:45:38 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e22 e22: 3 total, 2 up, 3 in
Oct 10 05:45:38 np0005479823 ceph-mgr[75218]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Oct 10 05:45:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:45:38.920+0000 7f4b1c081140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Oct 10 05:45:38 np0005479823 ceph-mgr[75218]: ms_deliver_dispatch: unhandled message 0x55da60f76d00 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Oct 10 05:45:39 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/2251912187' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Oct 10 05:45:39 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Oct 10 05:45:39 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/2251912187' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Oct 10 05:45:39 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Oct 10 05:45:39 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct 10 05:45:39 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e22 _set_new_cache_sizes cache_size:1020053225 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:45:39 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e23 e23: 3 total, 2 up, 3 in
Oct 10 05:45:40 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/1271642618' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]: dispatch
Oct 10 05:45:40 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:40 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:40 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Oct 10 05:45:40 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Oct 10 05:45:40 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/1271642618' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Oct 10 05:45:40 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Oct 10 05:45:40 np0005479823 vibrant_khorana[75686]: stderr: 2025-10-10T09:45:37.570+0000 7f21506e6740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) No valid bdev label found
Oct 10 05:45:40 np0005479823 vibrant_khorana[75686]: stderr: 2025-10-10T09:45:37.837+0000 7f21506e6740 -1 bluestore(/var/lib/ceph/osd/ceph-2/) _read_fsid unparsable uuid
Oct 10 05:45:40 np0005479823 vibrant_khorana[75686]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Oct 10 05:45:40 np0005479823 vibrant_khorana[75686]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Oct 10 05:45:40 np0005479823 vibrant_khorana[75686]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Oct 10 05:45:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e24 e24: 3 total, 2 up, 3 in
Oct 10 05:45:41 np0005479823 vibrant_khorana[75686]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Oct 10 05:45:41 np0005479823 vibrant_khorana[75686]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Oct 10 05:45:41 np0005479823 vibrant_khorana[75686]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Oct 10 05:45:41 np0005479823 vibrant_khorana[75686]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Oct 10 05:45:41 np0005479823 vibrant_khorana[75686]: --> ceph-volume lvm activate successful for osd ID: 2
Oct 10 05:45:41 np0005479823 vibrant_khorana[75686]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Oct 10 05:45:41 np0005479823 systemd[1]: libpod-df7a01f707074bf942355f8ecb790223ec20609911b9454888ee19157bfca9ac.scope: Deactivated successfully.
Oct 10 05:45:41 np0005479823 systemd[1]: libpod-df7a01f707074bf942355f8ecb790223ec20609911b9454888ee19157bfca9ac.scope: Consumed 1.964s CPU time.
Oct 10 05:45:41 np0005479823 podman[75669]: 2025-10-10 09:45:41.250103746 +0000 UTC m=+5.326736086 container died df7a01f707074bf942355f8ecb790223ec20609911b9454888ee19157bfca9ac (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vibrant_khorana, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 05:45:41 np0005479823 systemd[1]: var-lib-containers-storage-overlay-cb76ecaf72fb0897d828199edc1e984035c5564013a83481704aa19ca322213b-merged.mount: Deactivated successfully.
Oct 10 05:45:41 np0005479823 podman[75669]: 2025-10-10 09:45:41.306935802 +0000 UTC m=+5.383568062 container remove df7a01f707074bf942355f8ecb790223ec20609911b9454888ee19157bfca9ac (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vibrant_khorana, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325)
Oct 10 05:45:41 np0005479823 systemd[1]: libpod-conmon-df7a01f707074bf942355f8ecb790223ec20609911b9454888ee19157bfca9ac.scope: Deactivated successfully.
Oct 10 05:45:41 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/2550341542' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]: dispatch
Oct 10 05:45:41 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Oct 10 05:45:41 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/2550341542' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Oct 10 05:45:41 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Oct 10 05:45:41 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct 10 05:45:41 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct 10 05:45:41 np0005479823 podman[76768]: 2025-10-10 09:45:41.817122751 +0000 UTC m=+0.038708468 container create b87903639b01baf62aa914e6ecab6a10120e54d91c311c24662f7bd1d9c2f3e2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_golick, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.40.1)
Oct 10 05:45:41 np0005479823 systemd[1]: Started libpod-conmon-b87903639b01baf62aa914e6ecab6a10120e54d91c311c24662f7bd1d9c2f3e2.scope.
Oct 10 05:45:41 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:45:41 np0005479823 podman[76768]: 2025-10-10 09:45:41.79895823 +0000 UTC m=+0.020543967 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:45:41 np0005479823 podman[76768]: 2025-10-10 09:45:41.896095973 +0000 UTC m=+0.117681740 container init b87903639b01baf62aa914e6ecab6a10120e54d91c311c24662f7bd1d9c2f3e2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_golick, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct 10 05:45:41 np0005479823 podman[76768]: 2025-10-10 09:45:41.901799786 +0000 UTC m=+0.123385503 container start b87903639b01baf62aa914e6ecab6a10120e54d91c311c24662f7bd1d9c2f3e2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_golick, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 05:45:41 np0005479823 podman[76768]: 2025-10-10 09:45:41.904857764 +0000 UTC m=+0.126443481 container attach b87903639b01baf62aa914e6ecab6a10120e54d91c311c24662f7bd1d9c2f3e2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_golick, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Oct 10 05:45:41 np0005479823 kind_golick[76784]: 167 167
Oct 10 05:45:41 np0005479823 systemd[1]: libpod-b87903639b01baf62aa914e6ecab6a10120e54d91c311c24662f7bd1d9c2f3e2.scope: Deactivated successfully.
Oct 10 05:45:41 np0005479823 podman[76768]: 2025-10-10 09:45:41.9072689 +0000 UTC m=+0.128854647 container died b87903639b01baf62aa914e6ecab6a10120e54d91c311c24662f7bd1d9c2f3e2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_golick, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 05:45:41 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e25 e25: 3 total, 2 up, 3 in
Oct 10 05:45:41 np0005479823 systemd[1]: var-lib-containers-storage-overlay-f408342537c5e6fdd036ccc0796c0a26f0667216386826a04c197b35d27c0187-merged.mount: Deactivated successfully.
Oct 10 05:45:41 np0005479823 podman[76768]: 2025-10-10 09:45:41.948464077 +0000 UTC m=+0.170049794 container remove b87903639b01baf62aa914e6ecab6a10120e54d91c311c24662f7bd1d9c2f3e2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_golick, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct 10 05:45:41 np0005479823 systemd[1]: libpod-conmon-b87903639b01baf62aa914e6ecab6a10120e54d91c311c24662f7bd1d9c2f3e2.scope: Deactivated successfully.
Oct 10 05:45:42 np0005479823 podman[76808]: 2025-10-10 09:45:42.127876409 +0000 UTC m=+0.043013955 container create c5da9621b18668ff242c3e19d6a9fbabe069ba17b42d5ecac305e9d57ea5fe36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pedantic_ardinghelli, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Oct 10 05:45:42 np0005479823 systemd[1]: Started libpod-conmon-c5da9621b18668ff242c3e19d6a9fbabe069ba17b42d5ecac305e9d57ea5fe36.scope.
Oct 10 05:45:42 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:45:42 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb225fa0f9c578bd5c5442ba3963afd765519ae78f045d740cbd09adf79c86e3/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:42 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb225fa0f9c578bd5c5442ba3963afd765519ae78f045d740cbd09adf79c86e3/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:42 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb225fa0f9c578bd5c5442ba3963afd765519ae78f045d740cbd09adf79c86e3/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:42 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb225fa0f9c578bd5c5442ba3963afd765519ae78f045d740cbd09adf79c86e3/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:42 np0005479823 podman[76808]: 2025-10-10 09:45:42.108093787 +0000 UTC m=+0.023231383 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:45:42 np0005479823 podman[76808]: 2025-10-10 09:45:42.208649719 +0000 UTC m=+0.123787295 container init c5da9621b18668ff242c3e19d6a9fbabe069ba17b42d5ecac305e9d57ea5fe36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pedantic_ardinghelli, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.40.1)
Oct 10 05:45:42 np0005479823 podman[76808]: 2025-10-10 09:45:42.222311206 +0000 UTC m=+0.137448772 container start c5da9621b18668ff242c3e19d6a9fbabe069ba17b42d5ecac305e9d57ea5fe36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pedantic_ardinghelli, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Oct 10 05:45:42 np0005479823 podman[76808]: 2025-10-10 09:45:42.227378227 +0000 UTC m=+0.142515813 container attach c5da9621b18668ff242c3e19d6a9fbabe069ba17b42d5ecac305e9d57ea5fe36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pedantic_ardinghelli, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 05:45:42 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/1162723757' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Oct 10 05:45:42 np0005479823 ceph-mon[74913]: Health check update: 4 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Oct 10 05:45:42 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Oct 10 05:45:42 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Oct 10 05:45:42 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Oct 10 05:45:42 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/1162723757' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Oct 10 05:45:42 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]: dispatch
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]: {
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:    "2": [
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:        {
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:            "devices": [
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:                "/dev/loop3"
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:            ],
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:            "lv_name": "ceph_lv0",
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:            "lv_size": "21470642176",
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=ohvxnl-h5B1-cd0V-szWk-w8oI-A7ra-lPf83P,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=21f084a3-af34-5230-afe4-ea5cd24a55f4,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=fd47bcfa-dab9-466a-b4bb-0169e493040a,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:            "lv_uuid": "ohvxnl-h5B1-cd0V-szWk-w8oI-A7ra-lPf83P",
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:            "name": "ceph_lv0",
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:            "path": "/dev/ceph_vg0/ceph_lv0",
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:            "tags": {
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:                "ceph.block_uuid": "ohvxnl-h5B1-cd0V-szWk-w8oI-A7ra-lPf83P",
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:                "ceph.cephx_lockbox_secret": "",
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:                "ceph.cluster_fsid": "21f084a3-af34-5230-afe4-ea5cd24a55f4",
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:                "ceph.cluster_name": "ceph",
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:                "ceph.crush_device_class": "",
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:                "ceph.encrypted": "0",
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:                "ceph.osd_fsid": "fd47bcfa-dab9-466a-b4bb-0169e493040a",
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:                "ceph.osd_id": "2",
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:                "ceph.osdspec_affinity": "default_drive_group",
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:                "ceph.type": "block",
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:                "ceph.vdo": "0",
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:                "ceph.with_tpm": "0"
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:            },
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:            "type": "block",
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:            "vg_name": "ceph_vg0"
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:        }
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]:    ]
Oct 10 05:45:42 np0005479823 pedantic_ardinghelli[76825]: }
Oct 10 05:45:42 np0005479823 systemd[1]: libpod-c5da9621b18668ff242c3e19d6a9fbabe069ba17b42d5ecac305e9d57ea5fe36.scope: Deactivated successfully.
Oct 10 05:45:42 np0005479823 podman[76808]: 2025-10-10 09:45:42.505896136 +0000 UTC m=+0.421033792 container died c5da9621b18668ff242c3e19d6a9fbabe069ba17b42d5ecac305e9d57ea5fe36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pedantic_ardinghelli, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 05:45:42 np0005479823 systemd[1]: var-lib-containers-storage-overlay-bb225fa0f9c578bd5c5442ba3963afd765519ae78f045d740cbd09adf79c86e3-merged.mount: Deactivated successfully.
Oct 10 05:45:42 np0005479823 podman[76808]: 2025-10-10 09:45:42.559118426 +0000 UTC m=+0.474256032 container remove c5da9621b18668ff242c3e19d6a9fbabe069ba17b42d5ecac305e9d57ea5fe36 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pedantic_ardinghelli, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1)
Oct 10 05:45:42 np0005479823 systemd[1]: libpod-conmon-c5da9621b18668ff242c3e19d6a9fbabe069ba17b42d5ecac305e9d57ea5fe36.scope: Deactivated successfully.
Oct 10 05:45:42 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e26 e26: 3 total, 2 up, 3 in
Oct 10 05:45:43 np0005479823 podman[76936]: 2025-10-10 09:45:43.288313872 +0000 UTC m=+0.056638251 container create 63e9bee07ddb173b3e4d5cfde54d118983d3cde1fdc3073e5d1eba7367c2a5c0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_knuth, OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 05:45:43 np0005479823 systemd[1]: Started libpod-conmon-63e9bee07ddb173b3e4d5cfde54d118983d3cde1fdc3073e5d1eba7367c2a5c0.scope.
Oct 10 05:45:43 np0005479823 podman[76936]: 2025-10-10 09:45:43.261491175 +0000 UTC m=+0.029815604 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:45:43 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:45:43 np0005479823 podman[76936]: 2025-10-10 09:45:43.377678737 +0000 UTC m=+0.146003126 container init 63e9bee07ddb173b3e4d5cfde54d118983d3cde1fdc3073e5d1eba7367c2a5c0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_knuth, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Oct 10 05:45:43 np0005479823 podman[76936]: 2025-10-10 09:45:43.387977785 +0000 UTC m=+0.156302144 container start 63e9bee07ddb173b3e4d5cfde54d118983d3cde1fdc3073e5d1eba7367c2a5c0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_knuth, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid)
Oct 10 05:45:43 np0005479823 podman[76936]: 2025-10-10 09:45:43.391548 +0000 UTC m=+0.159872399 container attach 63e9bee07ddb173b3e4d5cfde54d118983d3cde1fdc3073e5d1eba7367c2a5c0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_knuth, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Oct 10 05:45:43 np0005479823 sleepy_knuth[76952]: 167 167
Oct 10 05:45:43 np0005479823 systemd[1]: libpod-63e9bee07ddb173b3e4d5cfde54d118983d3cde1fdc3073e5d1eba7367c2a5c0.scope: Deactivated successfully.
Oct 10 05:45:43 np0005479823 podman[76936]: 2025-10-10 09:45:43.395875948 +0000 UTC m=+0.164200377 container died 63e9bee07ddb173b3e4d5cfde54d118983d3cde1fdc3073e5d1eba7367c2a5c0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_knuth, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_REF=squid)
Oct 10 05:45:43 np0005479823 systemd[1]: var-lib-containers-storage-overlay-98e49d422ac95e2b5dedd5b14a202537dfd3b9926624b62d90bb03d57dbdbbd5-merged.mount: Deactivated successfully.
Oct 10 05:45:43 np0005479823 podman[76936]: 2025-10-10 09:45:43.455583905 +0000 UTC m=+0.223908284 container remove 63e9bee07ddb173b3e4d5cfde54d118983d3cde1fdc3073e5d1eba7367c2a5c0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sleepy_knuth, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325)
Oct 10 05:45:43 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Oct 10 05:45:43 np0005479823 ceph-mon[74913]: Deploying daemon osd.2 on compute-2
Oct 10 05:45:43 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/616535579' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Oct 10 05:45:43 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]': finished
Oct 10 05:45:43 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct 10 05:45:43 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct 10 05:45:43 np0005479823 systemd[1]: libpod-conmon-63e9bee07ddb173b3e4d5cfde54d118983d3cde1fdc3073e5d1eba7367c2a5c0.scope: Deactivated successfully.
Oct 10 05:45:43 np0005479823 podman[76982]: 2025-10-10 09:45:43.758396039 +0000 UTC m=+0.045607068 container create 7d12b347e9b92e960c2915bb1339d752fcc612d03786e5e07a1e0d7d6e879545 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate-test, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Oct 10 05:45:43 np0005479823 systemd[1]: Started libpod-conmon-7d12b347e9b92e960c2915bb1339d752fcc612d03786e5e07a1e0d7d6e879545.scope.
Oct 10 05:45:43 np0005479823 podman[76982]: 2025-10-10 09:45:43.739683962 +0000 UTC m=+0.026895021 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:45:43 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:45:43 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/604021fb1be3d29eb91c29552935f979d971165ea3db966be9009b29144bba40/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:43 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/604021fb1be3d29eb91c29552935f979d971165ea3db966be9009b29144bba40/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:43 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/604021fb1be3d29eb91c29552935f979d971165ea3db966be9009b29144bba40/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:43 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/604021fb1be3d29eb91c29552935f979d971165ea3db966be9009b29144bba40/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:43 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/604021fb1be3d29eb91c29552935f979d971165ea3db966be9009b29144bba40/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:43 np0005479823 podman[76982]: 2025-10-10 09:45:43.878139665 +0000 UTC m=+0.165350744 container init 7d12b347e9b92e960c2915bb1339d752fcc612d03786e5e07a1e0d7d6e879545 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate-test, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Oct 10 05:45:43 np0005479823 podman[76982]: 2025-10-10 09:45:43.887162583 +0000 UTC m=+0.174373602 container start 7d12b347e9b92e960c2915bb1339d752fcc612d03786e5e07a1e0d7d6e879545 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate-test, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 05:45:43 np0005479823 podman[76982]: 2025-10-10 09:45:43.890999606 +0000 UTC m=+0.178210685 container attach 7d12b347e9b92e960c2915bb1339d752fcc612d03786e5e07a1e0d7d6e879545 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate-test, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 05:45:43 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e27 e27: 3 total, 2 up, 3 in
Oct 10 05:45:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate-test[76999]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_FSID]
Oct 10 05:45:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate-test[76999]:                            [--no-systemd] [--no-tmpfs]
Oct 10 05:45:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate-test[76999]: ceph-volume activate: error: unrecognized arguments: --bad-option
Oct 10 05:45:44 np0005479823 systemd[1]: libpod-7d12b347e9b92e960c2915bb1339d752fcc612d03786e5e07a1e0d7d6e879545.scope: Deactivated successfully.
Oct 10 05:45:44 np0005479823 podman[76982]: 2025-10-10 09:45:44.078078333 +0000 UTC m=+0.365289362 container died 7d12b347e9b92e960c2915bb1339d752fcc612d03786e5e07a1e0d7d6e879545 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate-test, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1)
Oct 10 05:45:44 np0005479823 systemd[1]: var-lib-containers-storage-overlay-604021fb1be3d29eb91c29552935f979d971165ea3db966be9009b29144bba40-merged.mount: Deactivated successfully.
Oct 10 05:45:44 np0005479823 podman[76982]: 2025-10-10 09:45:44.125918981 +0000 UTC m=+0.413130050 container remove 7d12b347e9b92e960c2915bb1339d752fcc612d03786e5e07a1e0d7d6e879545 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate-test, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250325)
Oct 10 05:45:44 np0005479823 systemd[1]: libpod-conmon-7d12b347e9b92e960c2915bb1339d752fcc612d03786e5e07a1e0d7d6e879545.scope: Deactivated successfully.
Oct 10 05:45:44 np0005479823 systemd[1]: Reloading.
Oct 10 05:45:44 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:45:44 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:45:44 np0005479823 systemd[1]: Reloading.
Oct 10 05:45:44 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:45:44 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:45:44 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e27 _set_new_cache_sizes cache_size:1020054711 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:45:44 np0005479823 systemd[1]: Starting Ceph osd.2 for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 05:45:44 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/616535579' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Oct 10 05:45:44 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]': finished
Oct 10 05:45:44 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Oct 10 05:45:44 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e28 e28: 3 total, 2 up, 3 in
Oct 10 05:45:45 np0005479823 podman[77157]: 2025-10-10 09:45:45.108245704 +0000 UTC m=+0.036616411 container create a2377887547ba9726146884c0c0e73572d0a95e5b700832a1a1ecf1a3990433e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 05:45:45 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:45:45 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/443d5e728f58b50b1d4a6d47d52bda03bfabb6d57f94682f44caf9d3ca21e440/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:45 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/443d5e728f58b50b1d4a6d47d52bda03bfabb6d57f94682f44caf9d3ca21e440/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:45 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/443d5e728f58b50b1d4a6d47d52bda03bfabb6d57f94682f44caf9d3ca21e440/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:45 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/443d5e728f58b50b1d4a6d47d52bda03bfabb6d57f94682f44caf9d3ca21e440/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:45 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/443d5e728f58b50b1d4a6d47d52bda03bfabb6d57f94682f44caf9d3ca21e440/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:45 np0005479823 podman[77157]: 2025-10-10 09:45:45.174355936 +0000 UTC m=+0.102726683 container init a2377887547ba9726146884c0c0e73572d0a95e5b700832a1a1ecf1a3990433e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 05:45:45 np0005479823 podman[77157]: 2025-10-10 09:45:45.185206082 +0000 UTC m=+0.113576769 container start a2377887547ba9726146884c0c0e73572d0a95e5b700832a1a1ecf1a3990433e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Oct 10 05:45:45 np0005479823 podman[77157]: 2025-10-10 09:45:45.091882071 +0000 UTC m=+0.020252798 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:45:45 np0005479823 podman[77157]: 2025-10-10 09:45:45.18858057 +0000 UTC m=+0.116951357 container attach a2377887547ba9726146884c0c0e73572d0a95e5b700832a1a1ecf1a3990433e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Oct 10 05:45:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate[77172]: Running command: /usr/bin/ceph-authtool --gen-print-key
Oct 10 05:45:45 np0005479823 bash[77157]: Running command: /usr/bin/ceph-authtool --gen-print-key
Oct 10 05:45:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate[77172]: Running command: /usr/bin/ceph-authtool --gen-print-key
Oct 10 05:45:45 np0005479823 bash[77157]: Running command: /usr/bin/ceph-authtool --gen-print-key
Oct 10 05:45:45 np0005479823 lvm[77253]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct 10 05:45:45 np0005479823 lvm[77253]: VG ceph_vg0 finished
Oct 10 05:45:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate[77172]: --> Failed to activate via raw: did not find any matching OSD to activate
Oct 10 05:45:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate[77172]: Running command: /usr/bin/ceph-authtool --gen-print-key
Oct 10 05:45:46 np0005479823 bash[77157]: --> Failed to activate via raw: did not find any matching OSD to activate
Oct 10 05:45:46 np0005479823 bash[77157]: Running command: /usr/bin/ceph-authtool --gen-print-key
Oct 10 05:45:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate[77172]: Running command: /usr/bin/ceph-authtool --gen-print-key
Oct 10 05:45:46 np0005479823 bash[77157]: Running command: /usr/bin/ceph-authtool --gen-print-key
Oct 10 05:45:46 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/2263940004' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Oct 10 05:45:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate[77172]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Oct 10 05:45:46 np0005479823 bash[77157]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Oct 10 05:45:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate[77172]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Oct 10 05:45:46 np0005479823 bash[77157]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Oct 10 05:45:46 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e29 e29: 3 total, 2 up, 3 in
Oct 10 05:45:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate[77172]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Oct 10 05:45:46 np0005479823 bash[77157]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Oct 10 05:45:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate[77172]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Oct 10 05:45:46 np0005479823 bash[77157]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Oct 10 05:45:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate[77172]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Oct 10 05:45:46 np0005479823 bash[77157]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Oct 10 05:45:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate[77172]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Oct 10 05:45:46 np0005479823 bash[77157]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Oct 10 05:45:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate[77172]: --> ceph-volume lvm activate successful for osd ID: 2
Oct 10 05:45:46 np0005479823 bash[77157]: --> ceph-volume lvm activate successful for osd ID: 2
Oct 10 05:45:46 np0005479823 systemd[1]: libpod-a2377887547ba9726146884c0c0e73572d0a95e5b700832a1a1ecf1a3990433e.scope: Deactivated successfully.
Oct 10 05:45:46 np0005479823 systemd[1]: libpod-a2377887547ba9726146884c0c0e73572d0a95e5b700832a1a1ecf1a3990433e.scope: Consumed 1.395s CPU time.
Oct 10 05:45:46 np0005479823 podman[77157]: 2025-10-10 09:45:46.541525464 +0000 UTC m=+1.469896161 container died a2377887547ba9726146884c0c0e73572d0a95e5b700832a1a1ecf1a3990433e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate, io.buildah.version=1.40.1, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 05:45:46 np0005479823 systemd[1]: var-lib-containers-storage-overlay-443d5e728f58b50b1d4a6d47d52bda03bfabb6d57f94682f44caf9d3ca21e440-merged.mount: Deactivated successfully.
Oct 10 05:45:46 np0005479823 podman[77157]: 2025-10-10 09:45:46.588356612 +0000 UTC m=+1.516727299 container remove a2377887547ba9726146884c0c0e73572d0a95e5b700832a1a1ecf1a3990433e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2-activate, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 05:45:46 np0005479823 podman[77404]: 2025-10-10 09:45:46.784332284 +0000 UTC m=+0.037232320 container create 0aa08009f7f58cec73f1dc2a942b136c595874bb56d58c9180505887d3182275 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Oct 10 05:45:46 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/96309a322ca16c3212575cb90ed0e5f7f34baee3f898bb9fff6636016c1612c0/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:46 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/96309a322ca16c3212575cb90ed0e5f7f34baee3f898bb9fff6636016c1612c0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:46 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/96309a322ca16c3212575cb90ed0e5f7f34baee3f898bb9fff6636016c1612c0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:46 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/96309a322ca16c3212575cb90ed0e5f7f34baee3f898bb9fff6636016c1612c0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:46 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/96309a322ca16c3212575cb90ed0e5f7f34baee3f898bb9fff6636016c1612c0/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:46 np0005479823 podman[77404]: 2025-10-10 09:45:46.837650998 +0000 UTC m=+0.090551044 container init 0aa08009f7f58cec73f1dc2a942b136c595874bb56d58c9180505887d3182275 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0)
Oct 10 05:45:46 np0005479823 podman[77404]: 2025-10-10 09:45:46.844582439 +0000 UTC m=+0.097482475 container start 0aa08009f7f58cec73f1dc2a942b136c595874bb56d58c9180505887d3182275 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid)
Oct 10 05:45:46 np0005479823 bash[77404]: 0aa08009f7f58cec73f1dc2a942b136c595874bb56d58c9180505887d3182275
Oct 10 05:45:46 np0005479823 podman[77404]: 2025-10-10 09:45:46.767854266 +0000 UTC m=+0.020754332 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:45:46 np0005479823 systemd[1]: Started Ceph osd.2 for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 05:45:46 np0005479823 ceph-osd[77423]: set uid:gid to 167:167 (ceph:ceph)
Oct 10 05:45:46 np0005479823 ceph-osd[77423]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-osd, pid 2
Oct 10 05:45:46 np0005479823 ceph-osd[77423]: pidfile_write: ignore empty --pid-file
Oct 10 05:45:46 np0005479823 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 05:45:46 np0005479823 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 05:45:46 np0005479823 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 05:45:46 np0005479823 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 05:45:47 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/2263940004' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Oct 10 05:45:47 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:47 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:47 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 05:45:47 np0005479823 podman[77533]: 2025-10-10 09:45:47.40952556 +0000 UTC m=+0.040138408 container create f5ecf4cfa714db0fb4bd33ca4af587d0ba8733243d08e04d3c4655142d39b62f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=romantic_galileo, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 05:45:47 np0005479823 systemd[1]: Started libpod-conmon-f5ecf4cfa714db0fb4bd33ca4af587d0ba8733243d08e04d3c4655142d39b62f.scope.
Oct 10 05:45:47 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:45:47 np0005479823 podman[77533]: 2025-10-10 09:45:47.389765392 +0000 UTC m=+0.020378260 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:45:47 np0005479823 podman[77533]: 2025-10-10 09:45:47.494319902 +0000 UTC m=+0.124932740 container init f5ecf4cfa714db0fb4bd33ca4af587d0ba8733243d08e04d3c4655142d39b62f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=romantic_galileo, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 05:45:47 np0005479823 podman[77533]: 2025-10-10 09:45:47.500174777 +0000 UTC m=+0.130787605 container start f5ecf4cfa714db0fb4bd33ca4af587d0ba8733243d08e04d3c4655142d39b62f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=romantic_galileo, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1)
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bdev(0x55cb2147b800 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bdev(0x55cb2147b800 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 05:45:47 np0005479823 podman[77533]: 2025-10-10 09:45:47.505063639 +0000 UTC m=+0.135676487 container attach f5ecf4cfa714db0fb4bd33ca4af587d0ba8733243d08e04d3c4655142d39b62f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=romantic_galileo, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Oct 10 05:45:47 np0005479823 romantic_galileo[77549]: 167 167
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bdev(0x55cb2147b800 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Oct 10 05:45:47 np0005479823 systemd[1]: libpod-f5ecf4cfa714db0fb4bd33ca4af587d0ba8733243d08e04d3c4655142d39b62f.scope: Deactivated successfully.
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bdev(0x55cb2147b800 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 05:45:47 np0005479823 podman[77533]: 2025-10-10 09:45:47.505968769 +0000 UTC m=+0.136581617 container died f5ecf4cfa714db0fb4bd33ca4af587d0ba8733243d08e04d3c4655142d39b62f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=romantic_galileo, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 05:45:47 np0005479823 systemd[1]: var-lib-containers-storage-overlay-0e1ef57bd601e4c5f84541d5e40ced1ef613c37c332d977c2a646394b06cea23-merged.mount: Deactivated successfully.
Oct 10 05:45:47 np0005479823 podman[77533]: 2025-10-10 09:45:47.553089127 +0000 UTC m=+0.183701945 container remove f5ecf4cfa714db0fb4bd33ca4af587d0ba8733243d08e04d3c4655142d39b62f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=romantic_galileo, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Oct 10 05:45:47 np0005479823 systemd[1]: libpod-conmon-f5ecf4cfa714db0fb4bd33ca4af587d0ba8733243d08e04d3c4655142d39b62f.scope: Deactivated successfully.
Oct 10 05:45:47 np0005479823 podman[77576]: 2025-10-10 09:45:47.728900548 +0000 UTC m=+0.053723389 container create 043a39333ddd64012fdb83c20b4b863c2e79dffbea4a243c871c85ad76147e08 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_solomon, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Oct 10 05:45:47 np0005479823 systemd[1]: Started libpod-conmon-043a39333ddd64012fdb83c20b4b863c2e79dffbea4a243c871c85ad76147e08.scope.
Oct 10 05:45:47 np0005479823 ceph-osd[77423]: bdev(0x55cb2147bc00 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 05:45:47 np0005479823 podman[77576]: 2025-10-10 09:45:47.697296656 +0000 UTC m=+0.022119587 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:45:47 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:45:47 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4f6e9f6126da2100b98796ec2e0372b5256720ab4fe80f6f3f31bffe791e689/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:47 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4f6e9f6126da2100b98796ec2e0372b5256720ab4fe80f6f3f31bffe791e689/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:47 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4f6e9f6126da2100b98796ec2e0372b5256720ab4fe80f6f3f31bffe791e689/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:47 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4f6e9f6126da2100b98796ec2e0372b5256720ab4fe80f6f3f31bffe791e689/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:47 np0005479823 podman[77576]: 2025-10-10 09:45:47.813760422 +0000 UTC m=+0.138583283 container init 043a39333ddd64012fdb83c20b4b863c2e79dffbea4a243c871c85ad76147e08 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_solomon, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Oct 10 05:45:47 np0005479823 podman[77576]: 2025-10-10 09:45:47.829024089 +0000 UTC m=+0.153846930 container start 043a39333ddd64012fdb83c20b4b863c2e79dffbea4a243c871c85ad76147e08 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_solomon, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 05:45:47 np0005479823 podman[77576]: 2025-10-10 09:45:47.833472008 +0000 UTC m=+0.158294869 container attach 043a39333ddd64012fdb83c20b4b863c2e79dffbea4a243c871c85ad76147e08 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_solomon, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True)
Oct 10 05:45:48 np0005479823 ceph-osd[77423]: starting osd.2 osd_data /var/lib/ceph/osd/ceph-2 /var/lib/ceph/osd/ceph-2/journal
Oct 10 05:45:48 np0005479823 ceph-osd[77423]: load: jerasure load: lrc 
Oct 10 05:45:48 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/2169807361' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Oct 10 05:45:48 np0005479823 ceph-mon[74913]: Health check update: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Oct 10 05:45:48 np0005479823 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 05:45:48 np0005479823 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 05:45:48 np0005479823 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 05:45:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Oct 10 05:45:48 np0005479823 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 05:45:48 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e30 e30: 3 total, 2 up, 3 in
Oct 10 05:45:48 np0005479823 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 05:45:48 np0005479823 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 05:45:48 np0005479823 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 05:45:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Oct 10 05:45:48 np0005479823 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 05:45:48 np0005479823 lvm[77676]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct 10 05:45:48 np0005479823 lvm[77676]: VG ceph_vg0 finished
Oct 10 05:45:48 np0005479823 ceph-osd[77423]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Oct 10 05:45:48 np0005479823 strange_solomon[77592]: {}
Oct 10 05:45:48 np0005479823 ceph-osd[77423]: osd.2:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Oct 10 05:45:48 np0005479823 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 05:45:48 np0005479823 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 05:45:48 np0005479823 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 05:45:48 np0005479823 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 05:45:48 np0005479823 systemd[1]: libpod-043a39333ddd64012fdb83c20b4b863c2e79dffbea4a243c871c85ad76147e08.scope: Deactivated successfully.
Oct 10 05:45:48 np0005479823 podman[77576]: 2025-10-10 09:45:48.630039186 +0000 UTC m=+0.954862017 container died 043a39333ddd64012fdb83c20b4b863c2e79dffbea4a243c871c85ad76147e08 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_solomon, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 05:45:48 np0005479823 systemd[1]: libpod-043a39333ddd64012fdb83c20b4b863c2e79dffbea4a243c871c85ad76147e08.scope: Consumed 1.241s CPU time.
Oct 10 05:45:48 np0005479823 systemd[1]: var-lib-containers-storage-overlay-e4f6e9f6126da2100b98796ec2e0372b5256720ab4fe80f6f3f31bffe791e689-merged.mount: Deactivated successfully.
Oct 10 05:45:48 np0005479823 podman[77576]: 2025-10-10 09:45:48.680997392 +0000 UTC m=+1.005820223 container remove 043a39333ddd64012fdb83c20b4b863c2e79dffbea4a243c871c85ad76147e08 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_solomon, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 05:45:48 np0005479823 systemd[1]: libpod-conmon-043a39333ddd64012fdb83c20b4b863c2e79dffbea4a243c871c85ad76147e08.scope: Deactivated successfully.
Oct 10 05:45:48 np0005479823 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 05:45:48 np0005479823 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 05:45:48 np0005479823 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 05:45:48 np0005479823 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 05:45:49 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/2169807361' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Oct 10 05:45:49 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:49 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bdev(0x55cb222fac00 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bdev(0x55cb222fb000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bdev(0x55cb222fb000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bdev(0x55cb222fb000 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluefs mount
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluefs mount shared_bdev_used = 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: RocksDB version: 7.9.2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Git sha 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Compile date 2025-07-17 03:12:14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: DB SUMMARY
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: DB Session ID:  E13QT5JWED64DXY9YRGI
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: CURRENT file:  CURRENT
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: IDENTITY file:  IDENTITY
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                         Options.error_if_exists: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.create_if_missing: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                         Options.paranoid_checks: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.flush_verify_memtable_count: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                                     Options.env: 0x55cb214cf650
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                                      Options.fs: LegacyFileSystem
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                                Options.info_log: 0x55cb222ff6e0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.max_file_opening_threads: 16
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                              Options.statistics: (nil)
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.use_fsync: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.max_log_file_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.log_file_time_to_roll: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.keep_log_file_num: 1000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.recycle_log_file_num: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                         Options.allow_fallocate: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.allow_mmap_reads: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.allow_mmap_writes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.use_direct_reads: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.create_missing_column_families: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                              Options.db_log_dir: 
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                                 Options.wal_dir: db.wal
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.table_cache_numshardbits: 6
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                         Options.WAL_ttl_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.WAL_size_limit_MB: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.manifest_preallocation_size: 4194304
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                     Options.is_fd_close_on_exec: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.advise_random_on_open: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.db_write_buffer_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.write_buffer_manager: 0x55cb223f2a00
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.access_hint_on_compaction_start: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                      Options.use_adaptive_mutex: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                            Options.rate_limiter: (nil)
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.wal_recovery_mode: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.enable_thread_tracking: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.enable_pipelined_write: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.unordered_write: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.write_thread_max_yield_usec: 100
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.row_cache: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                              Options.wal_filter: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.avoid_flush_during_recovery: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.allow_ingest_behind: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.two_write_queues: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.manual_wal_flush: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.wal_compression: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.atomic_flush: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.persist_stats_to_disk: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.write_dbid_to_manifest: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.log_readahead_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.best_efforts_recovery: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.allow_data_in_errors: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.db_host_id: __hostname__
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.enforce_single_del_contracts: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.max_background_jobs: 4
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.max_background_compactions: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.max_subcompactions: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:           Options.writable_file_max_buffer_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.delayed_write_rate : 16777216
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.max_total_wal_size: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.stats_dump_period_sec: 600
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.stats_persist_period_sec: 600
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.max_open_files: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.bytes_per_sync: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                      Options.wal_bytes_per_sync: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.strict_bytes_per_sync: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.compaction_readahead_size: 2097152
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.max_background_flushes: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Compression algorithms supported:
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: 	kZSTD supported: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: 	kXpressCompression supported: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: 	kBZip2Compression supported: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: 	kLZ4Compression supported: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: 	kZlibCompression supported: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: 	kLZ4HCCompression supported: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: 	kSnappyCompression supported: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Fast CRC32 supported: Supported on x86
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: DMutex implementation: pthread_mutex_t
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffaa0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55cb21511350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffaa0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55cb21511350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffaa0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55cb21511350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffaa0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55cb21511350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffaa0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55cb21511350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffaa0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55cb21511350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffaa0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55cb21511350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
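The column-family block above pins down a classic level-style LSM shape: 16 MiB memtables (`write_buffer_size`) that only flush once six have accumulated (`min_write_buffer_number_to_merge: 6`, so roughly 96 MiB per L0 file), an L0 compaction trigger of 8, and `max_bytes_for_level_base: 1073741824` growing by `max_bytes_for_level_multiplier: 8.000000` per level since `level_compaction_dynamic_level_bytes` is 0. A minimal sketch of that arithmetic, using only the values logged here (the function and variable names are ours, not a Ceph or RocksDB API):

```python
# Hedged sketch: derive approximate flush and level sizes from the
# RocksDB options dumped in the log above. Values are copied verbatim
# from the dump; names below are illustrative only.

MiB = 1024 * 1024
GiB = 1024 * MiB

write_buffer_size = 16 * MiB        # Options.write_buffer_size: 16777216
min_merge = 6                       # Options.min_write_buffer_number_to_merge
max_bytes_for_level_base = 1 * GiB  # Options.max_bytes_for_level_base
multiplier = 8                      # Options.max_bytes_for_level_multiplier
num_levels = 7                      # Options.num_levels

# Flushes wait for `min_merge` full memtables, so each L0 file is ~96 MiB.
flush_size = write_buffer_size * min_merge

def level_capacity(n: int) -> int:
    """Target byte capacity of level n (n >= 1) with static level sizing."""
    return max_bytes_for_level_base * multiplier ** (n - 1)

if __name__ == "__main__":
    print(f"approx L0 file size: {flush_size // MiB} MiB")
    for n in range(1, num_levels):
        print(f"L{n} target: {level_capacity(n) // GiB} GiB")
```

With these settings L1 targets 1 GiB, L2 8 GiB, L3 64 GiB, and so on, which is why the `soft_pending_compaction_bytes_limit` of 64 GiB sits near the L3 target.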
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffac0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55cb215109b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
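The per-column-family dumps above are nearly identical except for the table-factory details (block cache pointer and capacity), which makes them tedious to compare by eye. A small sketch for pulling the `Options.<key>: <value>` pairs out of journald lines like these so two dumps can be diffed as dictionaries (the regex and function names are ours, not part of Ceph or RocksDB):

```python
import re

# Hedged sketch: parse "rocksdb: Options.<key>: <value>" journald lines,
# like the column-family dumps above, into a plain dict for diffing.

OPT_RE = re.compile(r"rocksdb:\s+Options\.(\S+):\s+(.*\S)")

def parse_options(lines):
    """Return {option_name: value_string} for every Options line found."""
    opts = {}
    for line in lines:
        m = OPT_RE.search(line)
        if m:
            opts[m.group(1)] = m.group(2)
    return opts

# Two lines copied from the dump above, for demonstration.
sample = [
    "Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        "
    "Options.write_buffer_size: 16777216",
    "Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          "
    "Options.compression: LZ4",
]

if __name__ == "__main__":
    print(parse_options(sample))
```

Running `parse_options` over two column families' lines and diffing the resulting dicts would surface only the fields that actually differ, such as the block cache capacities (483183820 vs 536870912 bytes here).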
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffac0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55cb215109b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffac0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55cb215109b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: d2bae292-d520-4d17-8daf-8d5d2d3cbf01
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089549452625, "job": 1, "event": "recovery_started", "wal_files": [31]}
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089549452919, "job": 1, "event": "recovery_finished"}
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old nid_max 1025
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old blobid_max 10240
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta min_alloc_size 0x1000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: freelist init
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: freelist _read_cfg
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluefs umount
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bdev(0x55cb222fb000 /var/lib/ceph/osd/ceph-2/block) close
Oct 10 05:45:49 np0005479823 podman[78043]: 2025-10-10 09:45:49.701114719 +0000 UTC m=+0.055028282 container exec bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bdev(0x55cb222fb000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bdev(0x55cb222fb000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bdev(0x55cb222fb000 /var/lib/ceph/osd/ceph-2/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 20 GiB
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluefs mount
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluefs mount shared_bdev_used = 4718592
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: RocksDB version: 7.9.2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Git sha 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Compile date 2025-07-17 03:12:14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: DB SUMMARY
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: DB Session ID:  E13QT5JWED64DXY9YRGJ
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: CURRENT file:  CURRENT
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: IDENTITY file:  IDENTITY
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                         Options.error_if_exists: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.create_if_missing: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                         Options.paranoid_checks: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.flush_verify_memtable_count: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                                     Options.env: 0x55cb214cf110
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                                      Options.fs: LegacyFileSystem
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                                Options.info_log: 0x55cb222ff860
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.max_file_opening_threads: 16
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                              Options.statistics: (nil)
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.use_fsync: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.max_log_file_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.log_file_time_to_roll: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.keep_log_file_num: 1000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.recycle_log_file_num: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                         Options.allow_fallocate: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.allow_mmap_reads: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.allow_mmap_writes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.use_direct_reads: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.create_missing_column_families: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                              Options.db_log_dir: 
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                                 Options.wal_dir: db.wal
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.table_cache_numshardbits: 6
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                         Options.WAL_ttl_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.WAL_size_limit_MB: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.manifest_preallocation_size: 4194304
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                     Options.is_fd_close_on_exec: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.advise_random_on_open: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.db_write_buffer_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.write_buffer_manager: 0x55cb223f2a00
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.access_hint_on_compaction_start: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                      Options.use_adaptive_mutex: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                            Options.rate_limiter: (nil)
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.wal_recovery_mode: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.enable_thread_tracking: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.enable_pipelined_write: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.unordered_write: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.write_thread_max_yield_usec: 100
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.row_cache: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                              Options.wal_filter: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.avoid_flush_during_recovery: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.allow_ingest_behind: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.two_write_queues: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.manual_wal_flush: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.wal_compression: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.atomic_flush: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.persist_stats_to_disk: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.write_dbid_to_manifest: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.log_readahead_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.best_efforts_recovery: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.allow_data_in_errors: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.db_host_id: __hostname__
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.enforce_single_del_contracts: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.max_background_jobs: 4
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.max_background_compactions: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.max_subcompactions: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:           Options.writable_file_max_buffer_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.delayed_write_rate : 16777216
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.max_total_wal_size: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.stats_dump_period_sec: 600
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.stats_persist_period_sec: 600
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.max_open_files: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.bytes_per_sync: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                      Options.wal_bytes_per_sync: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.strict_bytes_per_sync: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.compaction_readahead_size: 2097152
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.max_background_flushes: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Compression algorithms supported:
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: 	kZSTD supported: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: 	kXpressCompression supported: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: 	kBZip2Compression supported: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: 	kLZ4Compression supported: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: 	kZlibCompression supported: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: 	kLZ4HCCompression supported: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: 	kSnappyCompression supported: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Fast CRC32 supported: Supported on x86
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: DMutex implementation: pthread_mutex_t
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ff5c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55cb21511350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ff5c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55cb21511350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ff5c0)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55cb21511350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ff5c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55cb21511350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ff5c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55cb21511350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ff5c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55cb21511350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ff5c0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55cb21511350#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffa00)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55cb215109b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffa00)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55cb215109b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:           Options.merge_operator: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.compaction_filter_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.sst_partitioner_factory: None
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.memtable_factory: SkipListFactory
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.table_factory: BlockBasedTable
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cb222ffa00)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55cb215109b0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.write_buffer_size: 16777216
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.max_write_buffer_number: 64
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.compression: LZ4
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression: Disabled
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.num_levels: 7
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:            Options.compression_opts.window_bits: -14
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.level: 32767
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.compression_opts.strategy: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.parallel_threads: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                  Options.compression_opts.enabled: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:              Options.level0_stop_writes_trigger: 36
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.target_file_size_base: 67108864
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:             Options.target_file_size_multiplier: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.arena_block_size: 1048576
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.disable_auto_compactions: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.inplace_update_support: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                 Options.inplace_update_num_locks: 10000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:               Options.memtable_whole_key_filtering: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:   Options.memtable_huge_page_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.bloom_locality: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                    Options.max_successive_merges: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.optimize_filters_for_hits: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.paranoid_file_checks: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.force_consistency_checks: 1
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.report_bg_io_stats: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                               Options.ttl: 2592000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.periodic_compaction_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:    Options.preserve_internal_time_seconds: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                       Options.enable_blob_files: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                           Options.min_blob_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                          Options.blob_file_size: 268435456
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                   Options.blob_compression_type: NoCompression
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.enable_blob_garbage_collection: false
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:          Options.blob_compaction_readahead_size: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb:                Options.blob_file_starting_level: 0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: d2bae292-d520-4d17-8daf-8d5d2d3cbf01
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089549720041, "job": 1, "event": "recovery_started", "wal_files": [31]}
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089549724435, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089549, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d2bae292-d520-4d17-8daf-8d5d2d3cbf01", "db_session_id": "E13QT5JWED64DXY9YRGJ", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089549727147, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1594, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 571, "raw_average_value_size": 285, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089549, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d2bae292-d520-4d17-8daf-8d5d2d3cbf01", "db_session_id": "E13QT5JWED64DXY9YRGJ", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089549730214, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089549, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d2bae292-d520-4d17-8daf-8d5d2d3cbf01", "db_session_id": "E13QT5JWED64DXY9YRGJ", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089549731905, "job": 1, "event": "recovery_finished"}
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55cb224a2000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: DB pointer 0x55cb2263e000
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super from 4, latest 4
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super done
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.03 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 460.80 MB usag
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/hello/cls_hello.cc:316: loading cls_hello
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: _get_class not permitted to load lua
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: _get_class not permitted to load sdk
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: osd.2 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: osd.2 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: osd.2 0 load_pgs
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: osd.2 0 load_pgs opened 0 pgs
Oct 10 05:45:49 np0005479823 ceph-osd[77423]: osd.2 0 log_to_monitors true
Oct 10 05:45:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2[77419]: 2025-10-10T09:45:49.760+0000 7fcffd2e7740 -1 osd.2 0 log_to_monitors true
Oct 10 05:45:49 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]} v 0)
Oct 10 05:45:49 np0005479823 ceph-mon[74913]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/269809354,v1:192.168.122.102:6801/269809354]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Oct 10 05:45:49 np0005479823 podman[78043]: 2025-10-10 09:45:49.830702462 +0000 UTC m=+0.184616005 container exec_died bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 05:45:49 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e30 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:45:50 np0005479823 ceph-mon[74913]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Oct 10 05:45:50 np0005479823 ceph-mon[74913]: Cluster is now healthy
Oct 10 05:45:50 np0005479823 ceph-mon[74913]: from='osd.2 [v2:192.168.122.102:6800/269809354,v1:192.168.122.102:6801/269809354]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Oct 10 05:45:50 np0005479823 ceph-mon[74913]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Oct 10 05:45:50 np0005479823 podman[78434]: 2025-10-10 09:45:50.683094959 +0000 UTC m=+0.046854071 container create b76aaa774c25a7614cbfc05fef5e72ce71240858ea1482fb9abbc16b2aaf7062 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quizzical_herschel, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid)
Oct 10 05:45:50 np0005479823 systemd[1]: Started libpod-conmon-b76aaa774c25a7614cbfc05fef5e72ce71240858ea1482fb9abbc16b2aaf7062.scope.
Oct 10 05:45:50 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:45:50 np0005479823 podman[78434]: 2025-10-10 09:45:50.662544094 +0000 UTC m=+0.026303226 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:45:50 np0005479823 podman[78434]: 2025-10-10 09:45:50.763593447 +0000 UTC m=+0.127352569 container init b76aaa774c25a7614cbfc05fef5e72ce71240858ea1482fb9abbc16b2aaf7062 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quizzical_herschel, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 05:45:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e31 e31: 3 total, 2 up, 3 in
Oct 10 05:45:50 np0005479823 podman[78434]: 2025-10-10 09:45:50.775752102 +0000 UTC m=+0.139511214 container start b76aaa774c25a7614cbfc05fef5e72ce71240858ea1482fb9abbc16b2aaf7062 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quizzical_herschel, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 05:45:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]} v 0)
Oct 10 05:45:50 np0005479823 ceph-mon[74913]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/269809354,v1:192.168.122.102:6801/269809354]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Oct 10 05:45:50 np0005479823 quizzical_herschel[78450]: 167 167
Oct 10 05:45:50 np0005479823 podman[78434]: 2025-10-10 09:45:50.780414887 +0000 UTC m=+0.144174019 container attach b76aaa774c25a7614cbfc05fef5e72ce71240858ea1482fb9abbc16b2aaf7062 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quizzical_herschel, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS)
Oct 10 05:45:50 np0005479823 podman[78434]: 2025-10-10 09:45:50.781013627 +0000 UTC m=+0.144772739 container died b76aaa774c25a7614cbfc05fef5e72ce71240858ea1482fb9abbc16b2aaf7062 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quizzical_herschel, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325)
Oct 10 05:45:50 np0005479823 systemd[1]: libpod-b76aaa774c25a7614cbfc05fef5e72ce71240858ea1482fb9abbc16b2aaf7062.scope: Deactivated successfully.
Oct 10 05:45:50 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Oct 10 05:45:50 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Oct 10 05:45:50 np0005479823 systemd[1]: var-lib-containers-storage-overlay-d247c8f1c8b7a8af4d5813292218eac9436c073d5e9124d9019b6df28f87c77d-merged.mount: Deactivated successfully.
Oct 10 05:45:50 np0005479823 podman[78434]: 2025-10-10 09:45:50.820769639 +0000 UTC m=+0.184528761 container remove b76aaa774c25a7614cbfc05fef5e72ce71240858ea1482fb9abbc16b2aaf7062 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=quizzical_herschel, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Oct 10 05:45:50 np0005479823 systemd[1]: libpod-conmon-b76aaa774c25a7614cbfc05fef5e72ce71240858ea1482fb9abbc16b2aaf7062.scope: Deactivated successfully.
Oct 10 05:45:50 np0005479823 podman[78475]: 2025-10-10 09:45:50.976350718 +0000 UTC m=+0.045805816 container create 89a361a487131c128919cc0cec5a721c2469a4e063b5b69562c474c2779021b6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_chatterjee, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 05:45:51 np0005479823 systemd[1]: Started libpod-conmon-89a361a487131c128919cc0cec5a721c2469a4e063b5b69562c474c2779021b6.scope.
Oct 10 05:45:51 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:45:51 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca3d902b2d5cbf721b701cf7bbd33d7e3830435f6319525d5e354f38ee4c5ced/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:51 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca3d902b2d5cbf721b701cf7bbd33d7e3830435f6319525d5e354f38ee4c5ced/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:51 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca3d902b2d5cbf721b701cf7bbd33d7e3830435f6319525d5e354f38ee4c5ced/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:51 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca3d902b2d5cbf721b701cf7bbd33d7e3830435f6319525d5e354f38ee4c5ced/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 10 05:45:51 np0005479823 podman[78475]: 2025-10-10 09:45:50.957164659 +0000 UTC m=+0.026619797 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:45:51 np0005479823 podman[78475]: 2025-10-10 09:45:51.057774747 +0000 UTC m=+0.127229855 container init 89a361a487131c128919cc0cec5a721c2469a4e063b5b69562c474c2779021b6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_chatterjee, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 05:45:51 np0005479823 podman[78475]: 2025-10-10 09:45:51.063789658 +0000 UTC m=+0.133244746 container start 89a361a487131c128919cc0cec5a721c2469a4e063b5b69562c474c2779021b6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_chatterjee, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 05:45:51 np0005479823 podman[78475]: 2025-10-10 09:45:51.067440929 +0000 UTC m=+0.136896067 container attach 89a361a487131c128919cc0cec5a721c2469a4e063b5b69562c474c2779021b6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_chatterjee, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 05:45:51 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:51 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:51 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:51 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:51 np0005479823 ceph-mon[74913]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Oct 10 05:45:51 np0005479823 ceph-mon[74913]: from='osd.2 [v2:192.168.122.102:6800/269809354,v1:192.168.122.102:6801/269809354]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Oct 10 05:45:51 np0005479823 ceph-mon[74913]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Oct 10 05:45:51 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct 10 05:45:51 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct 10 05:45:51 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct 10 05:45:51 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct 10 05:45:51 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]: [
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:    {
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:        "available": false,
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:        "being_replaced": false,
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:        "ceph_device_lvm": false,
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:        "device_id": "QEMU_DVD-ROM_QM00001",
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:        "lsm_data": {},
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:        "lvs": [],
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:        "path": "/dev/sr0",
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:        "rejected_reasons": [
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            "Insufficient space (<5GB)",
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            "Has a FileSystem"
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:        ],
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:        "sys_api": {
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            "actuators": null,
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            "device_nodes": [
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:                "sr0"
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            ],
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            "devname": "sr0",
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            "human_readable_size": "482.00 KB",
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            "id_bus": "ata",
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            "model": "QEMU DVD-ROM",
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            "nr_requests": "2",
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            "parent": "/dev/sr0",
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            "partitions": {},
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            "path": "/dev/sr0",
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            "removable": "1",
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            "rev": "2.5+",
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            "ro": "0",
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            "rotational": "0",
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            "sas_address": "",
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            "sas_device_handle": "",
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            "scheduler_mode": "mq-deadline",
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            "sectors": 0,
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            "sectorsize": "2048",
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            "size": 493568.0,
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            "support_discard": "2048",
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            "type": "disk",
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:            "vendor": "QEMU"
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:        }
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]:    }
Oct 10 05:45:51 np0005479823 zealous_chatterjee[78491]: ]
Oct 10 05:45:51 np0005479823 systemd[1]: libpod-89a361a487131c128919cc0cec5a721c2469a4e063b5b69562c474c2779021b6.scope: Deactivated successfully.
Oct 10 05:45:51 np0005479823 podman[78475]: 2025-10-10 09:45:51.725544179 +0000 UTC m=+0.794999267 container died 89a361a487131c128919cc0cec5a721c2469a4e063b5b69562c474c2779021b6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_chatterjee, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Oct 10 05:45:51 np0005479823 systemd[1]: var-lib-containers-storage-overlay-ca3d902b2d5cbf721b701cf7bbd33d7e3830435f6319525d5e354f38ee4c5ced-merged.mount: Deactivated successfully.
Oct 10 05:45:51 np0005479823 podman[78475]: 2025-10-10 09:45:51.772534864 +0000 UTC m=+0.841989982 container remove 89a361a487131c128919cc0cec5a721c2469a4e063b5b69562c474c2779021b6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_chatterjee, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 05:45:51 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e32 e32: 3 total, 2 up, 3 in
Oct 10 05:45:51 np0005479823 ceph-osd[77423]: osd.2 0 done with init, starting boot process
Oct 10 05:45:51 np0005479823 ceph-osd[77423]: osd.2 0 start_boot
Oct 10 05:45:51 np0005479823 ceph-osd[77423]: osd.2 0 maybe_override_options_for_qos osd_max_backfills set to 1
Oct 10 05:45:51 np0005479823 ceph-osd[77423]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Oct 10 05:45:51 np0005479823 ceph-osd[77423]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Oct 10 05:45:51 np0005479823 ceph-osd[77423]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Oct 10 05:45:51 np0005479823 ceph-osd[77423]: osd.2 0  bench count 12288000 bsize 4 KiB
Oct 10 05:45:51 np0005479823 systemd[1]: libpod-conmon-89a361a487131c128919cc0cec5a721c2469a4e063b5b69562c474c2779021b6.scope: Deactivated successfully.
Oct 10 05:45:52 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/2122384607' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Oct 10 05:45:52 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/2122384607' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Oct 10 05:45:52 np0005479823 ceph-mon[74913]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]': finished
Oct 10 05:45:52 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Oct 10 05:45:52 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Oct 10 05:45:52 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Oct 10 05:45:52 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Oct 10 05:45:52 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Oct 10 05:45:52 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:52 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:52 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Oct 10 05:45:52 np0005479823 ceph-mon[74913]: Adjusting osd_memory_target on compute-2 to 128.0M
Oct 10 05:45:52 np0005479823 ceph-mon[74913]: Unable to set osd_memory_target on compute-2 to 134243532: error parsing value: Value '134243532' is below minimum 939524096
Oct 10 05:45:52 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 05:45:52 np0005479823 ceph-mon[74913]: Updating compute-0:/etc/ceph/ceph.conf
Oct 10 05:45:52 np0005479823 ceph-mon[74913]: Updating compute-1:/etc/ceph/ceph.conf
Oct 10 05:45:52 np0005479823 ceph-mon[74913]: Updating compute-2:/etc/ceph/ceph.conf
Oct 10 05:45:52 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e33 e33: 3 total, 2 up, 3 in
Oct 10 05:45:53 np0005479823 ceph-mon[74913]: Updating compute-2:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 05:45:53 np0005479823 ceph-mon[74913]: Updating compute-0:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 05:45:53 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/2975567301' entity='client.admin' 
Oct 10 05:45:53 np0005479823 ceph-mon[74913]: Updating compute-1:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 05:45:53 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:53 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:54 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:54 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:54 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:54 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:54 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:54 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 05:45:54 np0005479823 ceph-mon[74913]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Oct 10 05:45:54 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:54 np0005479823 ceph-mon[74913]: Saving service ingress.rgw.default spec with placement count:2
Oct 10 05:45:54 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:54 np0005479823 ceph-osd[77423]: osd.2 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 35.622 iops: 9119.334 elapsed_sec: 0.329
Oct 10 05:45:54 np0005479823 ceph-osd[77423]: log_channel(cluster) log [WRN] : OSD bench result of 9119.333889 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Oct 10 05:45:54 np0005479823 ceph-osd[77423]: osd.2 0 waiting for initial osdmap
Oct 10 05:45:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2[77419]: 2025-10-10T09:45:54.216+0000 7fcff926a640 -1 osd.2 0 waiting for initial osdmap
Oct 10 05:45:54 np0005479823 ceph-osd[77423]: osd.2 33 crush map has features 288514051259236352, adjusting msgr requires for clients
Oct 10 05:45:54 np0005479823 ceph-osd[77423]: osd.2 33 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Oct 10 05:45:54 np0005479823 ceph-osd[77423]: osd.2 33 crush map has features 3314933000852226048, adjusting msgr requires for osds
Oct 10 05:45:54 np0005479823 ceph-osd[77423]: osd.2 33 check_osdmap_features require_osd_release unknown -> squid
Oct 10 05:45:54 np0005479823 ceph-osd[77423]: osd.2 33 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Oct 10 05:45:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-osd-2[77419]: 2025-10-10T09:45:54.241+0000 7fcff4892640 -1 osd.2 33 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Oct 10 05:45:54 np0005479823 ceph-osd[77423]: osd.2 33 set_numa_affinity not setting numa affinity
Oct 10 05:45:54 np0005479823 ceph-osd[77423]: osd.2 33 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial no unique device path for loop3: no symlink to loop3 in /dev/disk/by-path
Oct 10 05:45:54 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e33 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 33 tick checking mon for new map
Oct 10 05:45:55 np0005479823 ceph-mon[74913]: OSD bench result of 9119.333889 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Oct 10 05:45:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e34 e34: 3 total, 3 up, 3 in
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 34 state: booting -> active
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[4.19( empty local-lis/les=0/0 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[5.1a( empty local-lis/les=0/0 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[3.1b( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[4.1c( empty local-lis/les=0/0 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[3.1a( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[6.1e( empty local-lis/les=0/0 n=0 ec=27/21 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[3.9( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[5.e( empty local-lis/les=0/0 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[4.1d( empty local-lis/les=0/0 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[3.8( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[3.1d( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[4.3( empty local-lis/les=0/0 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[6.1( empty local-lis/les=0/0 n=0 ec=27/21 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[4.6( empty local-lis/les=0/0 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[5.4( empty local-lis/les=0/0 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[4.2( empty local-lis/les=0/0 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[4.1( empty local-lis/les=0/0 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[5.0( empty local-lis/les=0/0 n=0 ec=19/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[3.0( empty local-lis/les=0/0 n=0 ec=17/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[5.d( empty local-lis/les=0/0 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[5.b( empty local-lis/les=0/0 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[3.e( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[4.9( empty local-lis/les=0/0 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[5.8( empty local-lis/les=0/0 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[3.11( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[4.8( empty local-lis/les=0/0 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[4.15( empty local-lis/les=0/0 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[4.14( empty local-lis/les=0/0 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[6.17( empty local-lis/les=0/0 n=0 ec=27/21 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[5.12( empty local-lis/les=0/0 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[6.12( empty local-lis/les=0/0 n=0 ec=27/21 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[3.15( empty local-lis/les=0/0 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[5.13( empty local-lis/les=0/0 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[4.1f( empty local-lis/les=0/0 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[6.1c( empty local-lis/les=0/0 n=0 ec=27/21 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 34 pg[6.1b( empty local-lis/les=0/0 n=0 ec=27/21 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:56 np0005479823 ceph-mon[74913]: osd.2 [v2:192.168.122.102:6800/269809354,v1:192.168.122.102:6801/269809354] boot
Oct 10 05:45:56 np0005479823 ceph-mon[74913]: Saving service node-exporter spec with placement *
Oct 10 05:45:56 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:56 np0005479823 ceph-mon[74913]: Saving service grafana spec with placement compute-0;count:1
Oct 10 05:45:56 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:56 np0005479823 ceph-mon[74913]: Saving service prometheus spec with placement compute-0;count:1
Oct 10 05:45:56 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:56 np0005479823 ceph-mon[74913]: Saving service alertmanager spec with placement compute-0;count:1
Oct 10 05:45:56 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:56 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e35 e35: 3 total, 3 up, 3 in
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[6.17( empty local-lis/les=34/35 n=0 ec=27/21 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[5.8( empty local-lis/les=34/35 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[4.14( empty local-lis/les=34/35 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[3.e( empty local-lis/les=34/35 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[3.11( empty local-lis/les=34/35 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[5.b( empty local-lis/les=34/35 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[5.d( empty local-lis/les=34/35 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[4.8( empty local-lis/les=34/35 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[4.9( empty local-lis/les=34/35 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[3.0( empty local-lis/les=34/35 n=0 ec=17/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[4.2( empty local-lis/les=34/35 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[5.0( empty local-lis/les=34/35 n=0 ec=19/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[4.19( empty local-lis/les=34/35 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[6.1b( empty local-lis/les=34/35 n=0 ec=27/21 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[4.1( empty local-lis/les=34/35 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[5.1a( empty local-lis/les=34/35 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[3.1b( empty local-lis/les=34/35 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[4.6( empty local-lis/les=34/35 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[6.1e( empty local-lis/les=34/35 n=0 ec=27/21 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[5.e( empty local-lis/les=34/35 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[3.8( empty local-lis/les=34/35 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[4.1c( empty local-lis/les=34/35 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[4.1d( empty local-lis/les=34/35 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[3.1a( empty local-lis/les=34/35 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[4.15( empty local-lis/les=34/35 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[3.1d( empty local-lis/les=34/35 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[4.3( empty local-lis/les=34/35 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[3.9( empty local-lis/les=34/35 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[5.4( empty local-lis/les=34/35 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[5.12( empty local-lis/les=34/35 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[6.12( empty local-lis/les=34/35 n=0 ec=27/21 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[3.15( empty local-lis/les=34/35 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[6.1c( empty local-lis/les=34/35 n=0 ec=27/21 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[4.1f( empty local-lis/les=34/35 n=0 ec=25/18 lis/c=25/25 les/c/f=26/26/0 sis=34) [2] r=0 lpr=34 pi=[25,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[5.13( empty local-lis/les=34/35 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[6.1( empty local-lis/les=34/35 n=0 ec=27/21 lis/c=27/27 les/c/f=28/28/0 sis=34) [2] r=0 lpr=34 pi=[27,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 6.17 scrub starts
Oct 10 05:45:56 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 6.17 scrub ok
Oct 10 05:45:57 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:57 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/2898111592' entity='client.admin' 
Oct 10 05:45:57 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.14 scrub starts
Oct 10 05:45:57 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.14 scrub ok
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.15( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.13( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.18( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.12( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.f( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.10( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.d( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.c( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.b( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.5( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.a( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.1d( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.1c( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.1b( empty local-lis/les=0/0 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.13( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.18( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.12( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.15( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.10( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.f( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.c( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.5( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.d( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.b( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.1c( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.1d( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.1b( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 35 pg[2.a( empty local-lis/les=34/35 n=0 ec=23/16 lis/c=23/23 les/c/f=24/24/0 sis=34) [2] r=0 lpr=35 pi=[23,34)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:45:58 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:58 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:58 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/1237849469' entity='client.admin' 
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.e scrub starts
Oct 10 05:45:58 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.e scrub ok
Oct 10 05:45:59 np0005479823 ceph-mon[74913]: Reconfiguring mon.compute-0 (monmap changed)...
Oct 10 05:45:59 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Oct 10 05:45:59 np0005479823 ceph-mon[74913]: Reconfiguring daemon mon.compute-0 on compute-0
Oct 10 05:45:59 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:59 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:45:59 np0005479823 ceph-mon[74913]: Reconfiguring mgr.compute-0.xkdepb (monmap changed)...
Oct 10 05:45:59 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.xkdepb", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Oct 10 05:45:59 np0005479823 ceph-mon[74913]: Reconfiguring daemon mgr.compute-0.xkdepb on compute-0
Oct 10 05:45:59 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/3410162506' entity='client.admin' 
Oct 10 05:45:59 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e35 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:45:59 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Oct 10 05:45:59 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.8 scrub ok
Oct 10 05:46:00 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:00 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:00 np0005479823 ceph-mon[74913]: Reconfiguring crash.compute-0 (monmap changed)...
Oct 10 05:46:00 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Oct 10 05:46:00 np0005479823 ceph-mon[74913]: Reconfiguring daemon crash.compute-0 on compute-0
Oct 10 05:46:00 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.11 scrub starts
Oct 10 05:46:00 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.11 scrub ok
Oct 10 05:46:01 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:01 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:01 np0005479823 ceph-mon[74913]: Reconfiguring osd.0 (monmap changed)...
Oct 10 05:46:01 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Oct 10 05:46:01 np0005479823 ceph-mon[74913]: Reconfiguring daemon osd.0 on compute-0
Oct 10 05:46:01 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/2517476288' entity='client.admin' 
Oct 10 05:46:01 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:01 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:01 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Oct 10 05:46:01 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.b scrub starts
Oct 10 05:46:01 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.b scrub ok
Oct 10 05:46:02 np0005479823 ceph-mon[74913]: Reconfiguring crash.compute-1 (monmap changed)...
Oct 10 05:46:02 np0005479823 ceph-mon[74913]: Reconfiguring daemon crash.compute-1 on compute-1
Oct 10 05:46:02 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:02 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:02 np0005479823 ceph-mon[74913]: Reconfiguring osd.1 (monmap changed)...
Oct 10 05:46:02 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Oct 10 05:46:02 np0005479823 ceph-mon[74913]: Reconfiguring daemon osd.1 on compute-1
Oct 10 05:46:02 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command([{prefix=config set, name=mgr/dashboard/compute-1.rfugxc/server_addr}] v 0)
Oct 10 05:46:02 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.d scrub starts
Oct 10 05:46:02 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.d scrub ok
Oct 10 05:46:03 np0005479823 python3[80163]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a -f 'name=ceph-?(.*)-mgr.*' --format \{\{\.Command\}\} --no-trunc#012 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:46:03 np0005479823 ceph-mon[74913]: from='client.? ' entity='client.admin' 
Oct 10 05:46:03 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:03 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:03 np0005479823 ceph-mon[74913]: Reconfiguring mon.compute-1 (monmap changed)...
Oct 10 05:46:03 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Oct 10 05:46:03 np0005479823 ceph-mon[74913]: Reconfiguring daemon mon.compute-1 on compute-1
Oct 10 05:46:03 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.8 scrub starts
Oct 10 05:46:03 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.8 scrub ok
Oct 10 05:46:04 np0005479823 podman[80242]: 2025-10-10 09:46:04.151213714 +0000 UTC m=+0.043412166 container create 1ae58b0b261c7e2052c7e4a9bdf94ca925a649fa25f78a985c18180f3c9ed09d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_snyder, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Oct 10 05:46:04 np0005479823 systemd[1]: Started libpod-conmon-1ae58b0b261c7e2052c7e4a9bdf94ca925a649fa25f78a985c18180f3c9ed09d.scope.
Oct 10 05:46:04 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:46:04 np0005479823 podman[80242]: 2025-10-10 09:46:04.132215192 +0000 UTC m=+0.024413664 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:46:04 np0005479823 podman[80242]: 2025-10-10 09:46:04.230520164 +0000 UTC m=+0.122718646 container init 1ae58b0b261c7e2052c7e4a9bdf94ca925a649fa25f78a985c18180f3c9ed09d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_snyder, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 05:46:04 np0005479823 podman[80242]: 2025-10-10 09:46:04.2367348 +0000 UTC m=+0.128933252 container start 1ae58b0b261c7e2052c7e4a9bdf94ca925a649fa25f78a985c18180f3c9ed09d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_snyder, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 05:46:04 np0005479823 podman[80242]: 2025-10-10 09:46:04.240122783 +0000 UTC m=+0.132321235 container attach 1ae58b0b261c7e2052c7e4a9bdf94ca925a649fa25f78a985c18180f3c9ed09d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_snyder, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid)
Oct 10 05:46:04 np0005479823 sharp_snyder[80258]: 167 167
Oct 10 05:46:04 np0005479823 systemd[1]: libpod-1ae58b0b261c7e2052c7e4a9bdf94ca925a649fa25f78a985c18180f3c9ed09d.scope: Deactivated successfully.
Oct 10 05:46:04 np0005479823 podman[80242]: 2025-10-10 09:46:04.242044806 +0000 UTC m=+0.134243268 container died 1ae58b0b261c7e2052c7e4a9bdf94ca925a649fa25f78a985c18180f3c9ed09d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_snyder, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 05:46:04 np0005479823 systemd[1]: var-lib-containers-storage-overlay-c48fc53ebcfe2d7a68661659fca09026f77cc45ab364dd0da4eeb878447a5794-merged.mount: Deactivated successfully.
Oct 10 05:46:04 np0005479823 podman[80242]: 2025-10-10 09:46:04.278104697 +0000 UTC m=+0.170303149 container remove 1ae58b0b261c7e2052c7e4a9bdf94ca925a649fa25f78a985c18180f3c9ed09d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=sharp_snyder, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 05:46:04 np0005479823 systemd[1]: libpod-conmon-1ae58b0b261c7e2052c7e4a9bdf94ca925a649fa25f78a985c18180f3c9ed09d.scope: Deactivated successfully.
Oct 10 05:46:04 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:04 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:04 np0005479823 ceph-mon[74913]: Reconfiguring mon.compute-2 (monmap changed)...
Oct 10 05:46:04 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Oct 10 05:46:04 np0005479823 ceph-mon[74913]: Reconfiguring daemon mon.compute-2 on compute-2
Oct 10 05:46:04 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/699590867' entity='client.admin' 
Oct 10 05:46:04 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:04 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:04 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.gkrssp", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Oct 10 05:46:04 np0005479823 podman[80343]: 2025-10-10 09:46:04.724421669 +0000 UTC m=+0.035054177 container create 74494c38eb88fd4cb6e92d3690a50dc46ea128a170bb1f4a518a611a6f239c32 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_heyrovsky, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Oct 10 05:46:04 np0005479823 systemd[1]: Started libpod-conmon-74494c38eb88fd4cb6e92d3690a50dc46ea128a170bb1f4a518a611a6f239c32.scope.
Oct 10 05:46:04 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:46:04 np0005479823 podman[80343]: 2025-10-10 09:46:04.797022615 +0000 UTC m=+0.107655143 container init 74494c38eb88fd4cb6e92d3690a50dc46ea128a170bb1f4a518a611a6f239c32 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_heyrovsky, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 05:46:04 np0005479823 podman[80343]: 2025-10-10 09:46:04.80376459 +0000 UTC m=+0.114397098 container start 74494c38eb88fd4cb6e92d3690a50dc46ea128a170bb1f4a518a611a6f239c32 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_heyrovsky, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 05:46:04 np0005479823 podman[80343]: 2025-10-10 09:46:04.70911696 +0000 UTC m=+0.019749488 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:46:04 np0005479823 podman[80343]: 2025-10-10 09:46:04.807382581 +0000 UTC m=+0.118015119 container attach 74494c38eb88fd4cb6e92d3690a50dc46ea128a170bb1f4a518a611a6f239c32 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_heyrovsky, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 05:46:04 np0005479823 nostalgic_heyrovsky[80359]: 167 167
Oct 10 05:46:04 np0005479823 systemd[1]: libpod-74494c38eb88fd4cb6e92d3690a50dc46ea128a170bb1f4a518a611a6f239c32.scope: Deactivated successfully.
Oct 10 05:46:04 np0005479823 podman[80343]: 2025-10-10 09:46:04.812428158 +0000 UTC m=+0.123060676 container died 74494c38eb88fd4cb6e92d3690a50dc46ea128a170bb1f4a518a611a6f239c32 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_heyrovsky, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Oct 10 05:46:04 np0005479823 systemd[1]: var-lib-containers-storage-overlay-41cc191df743073c417e0eb5be5c6f6640f1aee3d02466fd2ff204cb5e998fe6-merged.mount: Deactivated successfully.
Oct 10 05:46:04 np0005479823 podman[80343]: 2025-10-10 09:46:04.844167864 +0000 UTC m=+0.154800372 container remove 74494c38eb88fd4cb6e92d3690a50dc46ea128a170bb1f4a518a611a6f239c32 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=nostalgic_heyrovsky, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Oct 10 05:46:04 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e35 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:46:04 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.9 scrub starts
Oct 10 05:46:04 np0005479823 systemd[1]: libpod-conmon-74494c38eb88fd4cb6e92d3690a50dc46ea128a170bb1f4a518a611a6f239c32.scope: Deactivated successfully.
Oct 10 05:46:04 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.9 scrub ok
Oct 10 05:46:05 np0005479823 ceph-mon[74913]: Reconfiguring mgr.compute-2.gkrssp (monmap changed)...
Oct 10 05:46:05 np0005479823 ceph-mon[74913]: Reconfiguring daemon mgr.compute-2.gkrssp on compute-2
Oct 10 05:46:05 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:05 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:05 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/1171706134' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Oct 10 05:46:05 np0005479823 podman[80497]: 2025-10-10 09:46:05.521096742 +0000 UTC m=+0.053004725 container exec bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Oct 10 05:46:05 np0005479823 podman[80497]: 2025-10-10 09:46:05.609128011 +0000 UTC m=+0.141035964 container exec_died bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 05:46:05 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.0 scrub starts
Oct 10 05:46:05 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.0 scrub ok
Oct 10 05:46:06 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:06 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:06 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/1171706134' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Oct 10 05:46:06 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:06 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:06 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.0 scrub starts
Oct 10 05:46:06 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.0 scrub ok
Oct 10 05:46:07 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 05:46:07 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:07 np0005479823 ceph-mon[74913]: from='mgr.14122 192.168.122.100:0/2212424954' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 05:46:07 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/520827948' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Oct 10 05:46:07 np0005479823 ceph-mgr[75218]: mgr handle_mgr_map respawning because set of enabled modules changed!
Oct 10 05:46:07 np0005479823 ceph-mgr[75218]: mgr respawn  e: '/usr/bin/ceph-mgr'
Oct 10 05:46:07 np0005479823 ceph-mgr[75218]: mgr respawn  0: '/usr/bin/ceph-mgr'
Oct 10 05:46:07 np0005479823 ceph-mgr[75218]: mgr respawn  1: '-n'
Oct 10 05:46:07 np0005479823 ceph-mgr[75218]: mgr respawn  2: 'mgr.compute-2.gkrssp'
Oct 10 05:46:07 np0005479823 ceph-mgr[75218]: mgr respawn  3: '-f'
Oct 10 05:46:07 np0005479823 ceph-mgr[75218]: mgr respawn  4: '--setuser'
Oct 10 05:46:07 np0005479823 ceph-mgr[75218]: mgr respawn  5: 'ceph'
Oct 10 05:46:07 np0005479823 ceph-mgr[75218]: mgr respawn  6: '--setgroup'
Oct 10 05:46:07 np0005479823 ceph-mgr[75218]: mgr respawn  7: 'ceph'
Oct 10 05:46:07 np0005479823 ceph-mgr[75218]: mgr respawn  8: '--default-log-to-file=false'
Oct 10 05:46:07 np0005479823 ceph-mgr[75218]: mgr respawn  9: '--default-log-to-journald=true'
Oct 10 05:46:07 np0005479823 ceph-mgr[75218]: mgr respawn  10: '--default-log-to-stderr=false'
Oct 10 05:46:07 np0005479823 ceph-mgr[75218]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Oct 10 05:46:07 np0005479823 ceph-mgr[75218]: mgr respawn  exe_path /proc/self/exe
Oct 10 05:46:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: ignoring --setuser ceph since I am not root
Oct 10 05:46:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: ignoring --setgroup ceph since I am not root
Oct 10 05:46:07 np0005479823 systemd[1]: session-27.scope: Deactivated successfully.
Oct 10 05:46:07 np0005479823 systemd-logind[796]: Session 27 logged out. Waiting for processes to exit.
Oct 10 05:46:07 np0005479823 systemd[1]: session-32.scope: Deactivated successfully.
Oct 10 05:46:07 np0005479823 systemd[1]: session-26.scope: Deactivated successfully.
Oct 10 05:46:07 np0005479823 systemd[1]: session-25.scope: Deactivated successfully.
Oct 10 05:46:07 np0005479823 systemd[1]: session-21.scope: Deactivated successfully.
Oct 10 05:46:07 np0005479823 systemd[1]: session-29.scope: Deactivated successfully.
Oct 10 05:46:07 np0005479823 systemd[1]: session-23.scope: Deactivated successfully.
Oct 10 05:46:07 np0005479823 systemd[1]: session-28.scope: Deactivated successfully.
Oct 10 05:46:07 np0005479823 systemd[1]: session-31.scope: Deactivated successfully.
Oct 10 05:46:07 np0005479823 systemd-logind[796]: Session 32 logged out. Waiting for processes to exit.
Oct 10 05:46:07 np0005479823 systemd[1]: session-30.scope: Deactivated successfully.
Oct 10 05:46:07 np0005479823 systemd-logind[796]: Session 26 logged out. Waiting for processes to exit.
Oct 10 05:46:07 np0005479823 systemd-logind[796]: Session 21 logged out. Waiting for processes to exit.
Oct 10 05:46:07 np0005479823 systemd[1]: session-24.scope: Deactivated successfully.
Oct 10 05:46:07 np0005479823 systemd-logind[796]: Session 23 logged out. Waiting for processes to exit.
Oct 10 05:46:07 np0005479823 systemd[1]: session-33.scope: Deactivated successfully.
Oct 10 05:46:07 np0005479823 systemd[1]: session-33.scope: Consumed 1min 1.779s CPU time.
Oct 10 05:46:07 np0005479823 systemd-logind[796]: Session 29 logged out. Waiting for processes to exit.
Oct 10 05:46:07 np0005479823 systemd-logind[796]: Session 28 logged out. Waiting for processes to exit.
Oct 10 05:46:07 np0005479823 systemd-logind[796]: Session 25 logged out. Waiting for processes to exit.
Oct 10 05:46:07 np0005479823 ceph-mgr[75218]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Oct 10 05:46:07 np0005479823 systemd-logind[796]: Session 31 logged out. Waiting for processes to exit.
Oct 10 05:46:07 np0005479823 ceph-mgr[75218]: pidfile_write: ignore empty --pid-file
Oct 10 05:46:07 np0005479823 systemd-logind[796]: Session 30 logged out. Waiting for processes to exit.
Oct 10 05:46:07 np0005479823 systemd-logind[796]: Session 24 logged out. Waiting for processes to exit.
Oct 10 05:46:07 np0005479823 systemd-logind[796]: Session 33 logged out. Waiting for processes to exit.
Oct 10 05:46:07 np0005479823 systemd-logind[796]: Removed session 27.
Oct 10 05:46:07 np0005479823 systemd-logind[796]: Removed session 32.
Oct 10 05:46:07 np0005479823 systemd-logind[796]: Removed session 26.
Oct 10 05:46:07 np0005479823 systemd-logind[796]: Removed session 25.
Oct 10 05:46:07 np0005479823 systemd-logind[796]: Removed session 21.
Oct 10 05:46:07 np0005479823 systemd-logind[796]: Removed session 29.
Oct 10 05:46:07 np0005479823 systemd-logind[796]: Removed session 23.
Oct 10 05:46:07 np0005479823 systemd-logind[796]: Removed session 28.
Oct 10 05:46:07 np0005479823 systemd-logind[796]: Removed session 31.
Oct 10 05:46:07 np0005479823 systemd-logind[796]: Removed session 30.
Oct 10 05:46:07 np0005479823 systemd-logind[796]: Removed session 24.
Oct 10 05:46:07 np0005479823 systemd-logind[796]: Removed session 33.
Oct 10 05:46:07 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'alerts'
Oct 10 05:46:07 np0005479823 ceph-mgr[75218]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Oct 10 05:46:07 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'balancer'
Oct 10 05:46:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:07.759+0000 7f3d49c77140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Oct 10 05:46:07 np0005479823 ceph-mgr[75218]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Oct 10 05:46:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:07.834+0000 7f3d49c77140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Oct 10 05:46:07 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'cephadm'
Oct 10 05:46:07 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.2 scrub starts
Oct 10 05:46:07 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.2 scrub ok
Oct 10 05:46:08 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/520827948' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Oct 10 05:46:08 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'crash'
Oct 10 05:46:08 np0005479823 ceph-mgr[75218]: mgr[py] Module crash has missing NOTIFY_TYPES member
Oct 10 05:46:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:08.591+0000 7f3d49c77140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Oct 10 05:46:08 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'dashboard'
Oct 10 05:46:08 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.19 scrub starts
Oct 10 05:46:08 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.19 scrub ok
Oct 10 05:46:09 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'devicehealth'
Oct 10 05:46:09 np0005479823 ceph-mgr[75218]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Oct 10 05:46:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:09.222+0000 7f3d49c77140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Oct 10 05:46:09 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'diskprediction_local'
Oct 10 05:46:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Oct 10 05:46:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Oct 10 05:46:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]:  from numpy import show_config as show_numpy_config
Oct 10 05:46:09 np0005479823 ceph-mgr[75218]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Oct 10 05:46:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:09.385+0000 7f3d49c77140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Oct 10 05:46:09 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'influx'
Oct 10 05:46:09 np0005479823 ceph-mgr[75218]: mgr[py] Module influx has missing NOTIFY_TYPES member
Oct 10 05:46:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:09.460+0000 7f3d49c77140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Oct 10 05:46:09 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'insights'
Oct 10 05:46:09 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'iostat'
Oct 10 05:46:09 np0005479823 ceph-mgr[75218]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Oct 10 05:46:09 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'k8sevents'
Oct 10 05:46:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:09.597+0000 7f3d49c77140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Oct 10 05:46:09 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Oct 10 05:46:09 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Oct 10 05:46:09 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e35 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:46:09 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'localpool'
Oct 10 05:46:10 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'mds_autoscaler'
Oct 10 05:46:10 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'mirroring'
Oct 10 05:46:10 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'nfs'
Oct 10 05:46:10 np0005479823 ceph-mgr[75218]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Oct 10 05:46:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:10.570+0000 7f3d49c77140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Oct 10 05:46:10 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'orchestrator'
Oct 10 05:46:10 np0005479823 ceph-mgr[75218]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Oct 10 05:46:10 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'osd_perf_query'
Oct 10 05:46:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:10.787+0000 7f3d49c77140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Oct 10 05:46:10 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.1 scrub starts
Oct 10 05:46:10 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.1 scrub ok
Oct 10 05:46:10 np0005479823 ceph-mgr[75218]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Oct 10 05:46:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:10.865+0000 7f3d49c77140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Oct 10 05:46:10 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'osd_support'
Oct 10 05:46:10 np0005479823 ceph-mgr[75218]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Oct 10 05:46:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:10.927+0000 7f3d49c77140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Oct 10 05:46:10 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'pg_autoscaler'
Oct 10 05:46:11 np0005479823 ceph-mgr[75218]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Oct 10 05:46:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:11.001+0000 7f3d49c77140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Oct 10 05:46:11 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'progress'
Oct 10 05:46:11 np0005479823 ceph-mgr[75218]: mgr[py] Module progress has missing NOTIFY_TYPES member
Oct 10 05:46:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:11.079+0000 7f3d49c77140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Oct 10 05:46:11 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'prometheus'
Oct 10 05:46:11 np0005479823 ceph-mgr[75218]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Oct 10 05:46:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:11.433+0000 7f3d49c77140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Oct 10 05:46:11 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'rbd_support'
Oct 10 05:46:11 np0005479823 ceph-mgr[75218]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Oct 10 05:46:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:11.532+0000 7f3d49c77140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Oct 10 05:46:11 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'restful'
Oct 10 05:46:11 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'rgw'
Oct 10 05:46:11 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 6.1b scrub starts
Oct 10 05:46:11 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 6.1b scrub ok
Oct 10 05:46:11 np0005479823 ceph-mgr[75218]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Oct 10 05:46:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:11.965+0000 7f3d49c77140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Oct 10 05:46:11 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'rook'
Oct 10 05:46:12 np0005479823 ceph-mgr[75218]: mgr[py] Module rook has missing NOTIFY_TYPES member
Oct 10 05:46:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:12.535+0000 7f3d49c77140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Oct 10 05:46:12 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'selftest'
Oct 10 05:46:12 np0005479823 ceph-mgr[75218]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Oct 10 05:46:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:12.607+0000 7f3d49c77140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Oct 10 05:46:12 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'snap_schedule'
Oct 10 05:46:12 np0005479823 ceph-mgr[75218]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Oct 10 05:46:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:12.686+0000 7f3d49c77140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Oct 10 05:46:12 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'stats'
Oct 10 05:46:12 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'status'
Oct 10 05:46:12 np0005479823 ceph-mgr[75218]: mgr[py] Module status has missing NOTIFY_TYPES member
Oct 10 05:46:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:12.834+0000 7f3d49c77140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Oct 10 05:46:12 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'telegraf'
Oct 10 05:46:12 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.6 scrub starts
Oct 10 05:46:12 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.6 scrub ok
Oct 10 05:46:12 np0005479823 ceph-mgr[75218]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Oct 10 05:46:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:12.906+0000 7f3d49c77140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Oct 10 05:46:12 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'telemetry'
Oct 10 05:46:13 np0005479823 ceph-mgr[75218]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Oct 10 05:46:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:13.058+0000 7f3d49c77140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Oct 10 05:46:13 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'test_orchestrator'
Oct 10 05:46:13 np0005479823 ceph-mgr[75218]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Oct 10 05:46:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:13.275+0000 7f3d49c77140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Oct 10 05:46:13 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'volumes'
Oct 10 05:46:13 np0005479823 ceph-mgr[75218]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Oct 10 05:46:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:13.551+0000 7f3d49c77140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Oct 10 05:46:13 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'zabbix'
Oct 10 05:46:13 np0005479823 ceph-mgr[75218]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Oct 10 05:46:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:13.648+0000 7f3d49c77140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Oct 10 05:46:13 np0005479823 ceph-mgr[75218]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Oct 10 05:46:13 np0005479823 ceph-mgr[75218]: mgr load Constructed class from module: dashboard
Oct 10 05:46:13 np0005479823 ceph-mgr[75218]: [dashboard INFO root] server: ssl=no host=192.168.122.102 port=8443
Oct 10 05:46:13 np0005479823 ceph-mgr[75218]: [dashboard INFO root] Configured CherryPy, starting engine...
Oct 10 05:46:13 np0005479823 ceph-mgr[75218]: [dashboard INFO root] Starting engine...
Oct 10 05:46:13 np0005479823 ceph-mgr[75218]: ms_deliver_dispatch: unhandled message 0x56304770f860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Oct 10 05:46:13 np0005479823 ceph-mgr[75218]: [dashboard INFO root] Engine started...
Oct 10 05:46:13 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e36 e36: 3 total, 3 up, 3 in
Oct 10 05:46:13 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.1b scrub starts
Oct 10 05:46:13 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.1b scrub ok
Oct 10 05:46:14 np0005479823 systemd-logind[796]: New session 34 of user ceph-admin.
Oct 10 05:46:14 np0005479823 systemd[1]: Started Session 34 of User ceph-admin.
Oct 10 05:46:14 np0005479823 ceph-mon[74913]: Active manager daemon compute-0.xkdepb restarted
Oct 10 05:46:14 np0005479823 ceph-mon[74913]: Activating manager daemon compute-0.xkdepb
Oct 10 05:46:14 np0005479823 ceph-mon[74913]: Manager daemon compute-0.xkdepb is now available
Oct 10 05:46:14 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.xkdepb/mirror_snapshot_schedule"}]: dispatch
Oct 10 05:46:14 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.xkdepb/trash_purge_schedule"}]: dispatch
Oct 10 05:46:14 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e36 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:46:14 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.e deep-scrub starts
Oct 10 05:46:14 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.e deep-scrub ok
Oct 10 05:46:15 np0005479823 podman[80833]: 2025-10-10 09:46:15.063412384 +0000 UTC m=+0.054116172 container exec bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Oct 10 05:46:15 np0005479823 podman[80833]: 2025-10-10 09:46:15.192064715 +0000 UTC m=+0.182768483 container exec_died bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Oct 10 05:46:15 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:15 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.1c scrub starts
Oct 10 05:46:15 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.1c scrub ok
Oct 10 05:46:16 np0005479823 ceph-mon[74913]: [10/Oct/2025:09:46:15] ENGINE Bus STARTING
Oct 10 05:46:16 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:16 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:16 np0005479823 ceph-mon[74913]: [10/Oct/2025:09:46:15] ENGINE Serving on https://192.168.122.100:7150
Oct 10 05:46:16 np0005479823 ceph-mon[74913]: [10/Oct/2025:09:46:15] ENGINE Client ('192.168.122.100', 44336) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Oct 10 05:46:16 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:16 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:16 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:16 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:16 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:16 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 6.1e scrub starts
Oct 10 05:46:16 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 6.1e scrub ok
Oct 10 05:46:17 np0005479823 ceph-mon[74913]: [10/Oct/2025:09:46:15] ENGINE Serving on http://192.168.122.100:8765
Oct 10 05:46:17 np0005479823 ceph-mon[74913]: [10/Oct/2025:09:46:15] ENGINE Bus STARTED
Oct 10 05:46:17 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:17 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:17 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Oct 10 05:46:17 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:17 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:17 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Oct 10 05:46:17 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:17 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:17 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:17 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Oct 10 05:46:17 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 05:46:17 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.1a scrub starts
Oct 10 05:46:17 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.1a scrub ok
Oct 10 05:46:18 np0005479823 ceph-mon[74913]: Adjusting osd_memory_target on compute-2 to 128.0M
Oct 10 05:46:18 np0005479823 ceph-mon[74913]: Unable to set osd_memory_target on compute-2 to 134243532: error parsing value: Value '134243532' is below minimum 939524096
Oct 10 05:46:18 np0005479823 ceph-mon[74913]: Adjusting osd_memory_target on compute-0 to 128.0M
Oct 10 05:46:18 np0005479823 ceph-mon[74913]: Unable to set osd_memory_target on compute-0 to 134240665: error parsing value: Value '134240665' is below minimum 939524096
Oct 10 05:46:18 np0005479823 ceph-mon[74913]: Adjusting osd_memory_target on compute-1 to 128.0M
Oct 10 05:46:18 np0005479823 ceph-mon[74913]: Unable to set osd_memory_target on compute-1 to 134243532: error parsing value: Value '134243532' is below minimum 939524096
Oct 10 05:46:18 np0005479823 ceph-mon[74913]: Updating compute-0:/etc/ceph/ceph.conf
Oct 10 05:46:18 np0005479823 ceph-mon[74913]: Updating compute-1:/etc/ceph/ceph.conf
Oct 10 05:46:18 np0005479823 ceph-mon[74913]: Updating compute-2:/etc/ceph/ceph.conf
Oct 10 05:46:18 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:18 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.15 scrub starts
Oct 10 05:46:18 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.15 scrub ok
Oct 10 05:46:19 np0005479823 ceph-mon[74913]: Updating compute-0:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 05:46:19 np0005479823 ceph-mon[74913]: Updating compute-2:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 05:46:19 np0005479823 ceph-mon[74913]: Updating compute-1:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 05:46:19 np0005479823 ceph-mon[74913]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Oct 10 05:46:19 np0005479823 ceph-mon[74913]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Oct 10 05:46:19 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:19 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:19 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:19 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:19 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:19 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.3 scrub starts
Oct 10 05:46:19 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e36 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:46:19 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.3 scrub ok
Oct 10 05:46:20 np0005479823 ceph-mgr[75218]: mgr handle_mgr_map respawning because set of enabled modules changed!
Oct 10 05:46:20 np0005479823 ceph-mgr[75218]: mgr respawn  e: '/usr/bin/ceph-mgr'
Oct 10 05:46:20 np0005479823 ceph-mgr[75218]: mgr respawn  0: '/usr/bin/ceph-mgr'
Oct 10 05:46:20 np0005479823 ceph-mgr[75218]: mgr respawn  1: '-n'
Oct 10 05:46:20 np0005479823 ceph-mgr[75218]: mgr respawn  2: 'mgr.compute-2.gkrssp'
Oct 10 05:46:20 np0005479823 ceph-mgr[75218]: mgr respawn  3: '-f'
Oct 10 05:46:20 np0005479823 ceph-mgr[75218]: mgr respawn  4: '--setuser'
Oct 10 05:46:20 np0005479823 ceph-mgr[75218]: mgr respawn  5: 'ceph'
Oct 10 05:46:20 np0005479823 ceph-mgr[75218]: mgr respawn  6: '--setgroup'
Oct 10 05:46:20 np0005479823 ceph-mgr[75218]: mgr respawn  7: 'ceph'
Oct 10 05:46:20 np0005479823 ceph-mgr[75218]: mgr respawn  8: '--default-log-to-file=false'
Oct 10 05:46:20 np0005479823 ceph-mgr[75218]: mgr respawn  9: '--default-log-to-journald=true'
Oct 10 05:46:20 np0005479823 ceph-mgr[75218]: mgr respawn  10: '--default-log-to-stderr=false'
Oct 10 05:46:20 np0005479823 ceph-mgr[75218]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Oct 10 05:46:20 np0005479823 ceph-mgr[75218]: mgr respawn  exe_path /proc/self/exe
Oct 10 05:46:20 np0005479823 ceph-mon[74913]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Oct 10 05:46:20 np0005479823 ceph-mon[74913]: Updating compute-2:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring
Oct 10 05:46:20 np0005479823 ceph-mon[74913]: Updating compute-0:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring
Oct 10 05:46:20 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/1314314115' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Oct 10 05:46:20 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:20 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:20 np0005479823 ceph-mon[74913]: from='mgr.14355 192.168.122.100:0/2142097187' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:20 np0005479823 systemd[1]: session-34.scope: Deactivated successfully.
Oct 10 05:46:20 np0005479823 systemd[1]: session-34.scope: Consumed 4.416s CPU time.
Oct 10 05:46:20 np0005479823 systemd-logind[796]: Session 34 logged out. Waiting for processes to exit.
Oct 10 05:46:20 np0005479823 systemd-logind[796]: Removed session 34.
Oct 10 05:46:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: ignoring --setuser ceph since I am not root
Oct 10 05:46:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: ignoring --setgroup ceph since I am not root
Oct 10 05:46:20 np0005479823 ceph-mgr[75218]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Oct 10 05:46:20 np0005479823 ceph-mgr[75218]: pidfile_write: ignore empty --pid-file
Oct 10 05:46:20 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'alerts'
Oct 10 05:46:20 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.1d scrub starts
Oct 10 05:46:20 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 4.1d scrub ok
Oct 10 05:46:20 np0005479823 ceph-mgr[75218]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Oct 10 05:46:20 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'balancer'
Oct 10 05:46:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:20.940+0000 7f9794936140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Oct 10 05:46:21 np0005479823 ceph-mgr[75218]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Oct 10 05:46:21 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'cephadm'
Oct 10 05:46:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:21.028+0000 7f9794936140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Oct 10 05:46:21 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/1314314115' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Oct 10 05:46:21 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/2158945969' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Oct 10 05:46:21 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'crash'
Oct 10 05:46:21 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Oct 10 05:46:21 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Oct 10 05:46:21 np0005479823 ceph-mgr[75218]: mgr[py] Module crash has missing NOTIFY_TYPES member
Oct 10 05:46:21 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'dashboard'
Oct 10 05:46:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:21.832+0000 7f9794936140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Oct 10 05:46:22 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'devicehealth'
Oct 10 05:46:22 np0005479823 ceph-mgr[75218]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Oct 10 05:46:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:22.490+0000 7f9794936140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Oct 10 05:46:22 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'diskprediction_local'
Oct 10 05:46:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Oct 10 05:46:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Oct 10 05:46:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]:  from numpy import show_config as show_numpy_config
Oct 10 05:46:22 np0005479823 ceph-mgr[75218]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Oct 10 05:46:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:22.653+0000 7f9794936140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Oct 10 05:46:22 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'influx'
Oct 10 05:46:22 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/2158945969' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Oct 10 05:46:22 np0005479823 ceph-mgr[75218]: mgr[py] Module influx has missing NOTIFY_TYPES member
Oct 10 05:46:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:22.724+0000 7f9794936140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Oct 10 05:46:22 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'insights'
Oct 10 05:46:22 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.9 deep-scrub starts
Oct 10 05:46:22 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'iostat'
Oct 10 05:46:22 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.9 deep-scrub ok
Oct 10 05:46:22 np0005479823 ceph-mgr[75218]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Oct 10 05:46:22 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'k8sevents'
Oct 10 05:46:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:22.869+0000 7f9794936140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Oct 10 05:46:23 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'localpool'
Oct 10 05:46:23 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'mds_autoscaler'
Oct 10 05:46:23 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'mirroring'
Oct 10 05:46:23 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'nfs'
Oct 10 05:46:23 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.12 scrub starts
Oct 10 05:46:23 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.12 scrub ok
Oct 10 05:46:23 np0005479823 ceph-mgr[75218]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Oct 10 05:46:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:23.871+0000 7f9794936140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Oct 10 05:46:23 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'orchestrator'
Oct 10 05:46:24 np0005479823 ceph-mgr[75218]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Oct 10 05:46:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:24.093+0000 7f9794936140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Oct 10 05:46:24 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'osd_perf_query'
Oct 10 05:46:24 np0005479823 ceph-mgr[75218]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Oct 10 05:46:24 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'osd_support'
Oct 10 05:46:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:24.169+0000 7f9794936140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Oct 10 05:46:24 np0005479823 ceph-mgr[75218]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Oct 10 05:46:24 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'pg_autoscaler'
Oct 10 05:46:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:24.232+0000 7f9794936140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Oct 10 05:46:24 np0005479823 ceph-mgr[75218]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Oct 10 05:46:24 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'progress'
Oct 10 05:46:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:24.313+0000 7f9794936140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Oct 10 05:46:24 np0005479823 ceph-mgr[75218]: mgr[py] Module progress has missing NOTIFY_TYPES member
Oct 10 05:46:24 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'prometheus'
Oct 10 05:46:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:24.383+0000 7f9794936140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Oct 10 05:46:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:24.715+0000 7f9794936140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Oct 10 05:46:24 np0005479823 ceph-mgr[75218]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Oct 10 05:46:24 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'rbd_support'
Oct 10 05:46:24 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.4 scrub starts
Oct 10 05:46:24 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.4 scrub ok
Oct 10 05:46:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:24.815+0000 7f9794936140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Oct 10 05:46:24 np0005479823 ceph-mgr[75218]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Oct 10 05:46:24 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'restful'
Oct 10 05:46:24 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e36 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:46:25 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'rgw'
Oct 10 05:46:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:25.270+0000 7f9794936140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Oct 10 05:46:25 np0005479823 ceph-mgr[75218]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Oct 10 05:46:25 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'rook'
Oct 10 05:46:25 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 6.12 deep-scrub starts
Oct 10 05:46:25 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 6.12 deep-scrub ok
Oct 10 05:46:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:25.833+0000 7f9794936140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Oct 10 05:46:25 np0005479823 ceph-mgr[75218]: mgr[py] Module rook has missing NOTIFY_TYPES member
Oct 10 05:46:25 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'selftest'
Oct 10 05:46:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:25.904+0000 7f9794936140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Oct 10 05:46:25 np0005479823 ceph-mgr[75218]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Oct 10 05:46:25 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'snap_schedule'
Oct 10 05:46:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:25.988+0000 7f9794936140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Oct 10 05:46:25 np0005479823 ceph-mgr[75218]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Oct 10 05:46:25 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'stats'
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'status'
Oct 10 05:46:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:26.136+0000 7f9794936140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: mgr[py] Module status has missing NOTIFY_TYPES member
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'telegraf'
Oct 10 05:46:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:26.208+0000 7f9794936140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'telemetry'
Oct 10 05:46:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:26.380+0000 7f9794936140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'test_orchestrator'
Oct 10 05:46:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:26.596+0000 7f9794936140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'volumes'
Oct 10 05:46:26 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 6.1c scrub starts
Oct 10 05:46:26 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 6.1c scrub ok
Oct 10 05:46:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:26.891+0000 7f9794936140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'zabbix'
Oct 10 05:46:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:26.963+0000 7f9794936140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Oct 10 05:46:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e37 e37: 3 total, 3 up, 3 in
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: ms_deliver_dispatch: unhandled message 0x56089d6a9860 mon_map magic: 0 from mon.0 v2:192.168.122.100:3300/0
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: mgr handle_mgr_map respawning because set of enabled modules changed!
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: mgr respawn  e: '/usr/bin/ceph-mgr'
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: mgr respawn  0: '/usr/bin/ceph-mgr'
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: mgr respawn  1: '-n'
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: mgr respawn  2: 'mgr.compute-2.gkrssp'
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: mgr respawn  3: '-f'
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: mgr respawn  4: '--setuser'
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: mgr respawn  5: 'ceph'
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: mgr respawn  6: '--setgroup'
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: mgr respawn  7: 'ceph'
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: mgr respawn  8: '--default-log-to-file=false'
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: mgr respawn  9: '--default-log-to-journald=true'
Oct 10 05:46:26 np0005479823 ceph-mgr[75218]: mgr respawn  10: '--default-log-to-stderr=false'
Oct 10 05:46:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: ignoring --setuser ceph since I am not root
Oct 10 05:46:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: ignoring --setgroup ceph since I am not root
Oct 10 05:46:27 np0005479823 ceph-mgr[75218]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Oct 10 05:46:27 np0005479823 ceph-mgr[75218]: pidfile_write: ignore empty --pid-file
Oct 10 05:46:27 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'alerts'
Oct 10 05:46:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:27.194+0000 7f268f1ee140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Oct 10 05:46:27 np0005479823 ceph-mgr[75218]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Oct 10 05:46:27 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'balancer'
Oct 10 05:46:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:27.277+0000 7f268f1ee140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Oct 10 05:46:27 np0005479823 ceph-mgr[75218]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Oct 10 05:46:27 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'cephadm'
Oct 10 05:46:27 np0005479823 ceph-mon[74913]: Active manager daemon compute-0.xkdepb restarted
Oct 10 05:46:27 np0005479823 ceph-mon[74913]: Activating manager daemon compute-0.xkdepb
Oct 10 05:46:27 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Oct 10 05:46:27 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Oct 10 05:46:27 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'crash'
Oct 10 05:46:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:28.079+0000 7f268f1ee140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Oct 10 05:46:28 np0005479823 ceph-mgr[75218]: mgr[py] Module crash has missing NOTIFY_TYPES member
Oct 10 05:46:28 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'dashboard'
Oct 10 05:46:28 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'devicehealth'
Oct 10 05:46:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:28.704+0000 7f268f1ee140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Oct 10 05:46:28 np0005479823 ceph-mgr[75218]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Oct 10 05:46:28 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'diskprediction_local'
Oct 10 05:46:28 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Oct 10 05:46:28 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Oct 10 05:46:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Oct 10 05:46:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Oct 10 05:46:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]:  from numpy import show_config as show_numpy_config
Oct 10 05:46:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:28.885+0000 7f268f1ee140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Oct 10 05:46:28 np0005479823 ceph-mgr[75218]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Oct 10 05:46:28 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'influx'
Oct 10 05:46:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:28.957+0000 7f268f1ee140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Oct 10 05:46:28 np0005479823 ceph-mgr[75218]: mgr[py] Module influx has missing NOTIFY_TYPES member
Oct 10 05:46:28 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'insights'
Oct 10 05:46:29 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'iostat'
Oct 10 05:46:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:29.091+0000 7f268f1ee140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Oct 10 05:46:29 np0005479823 ceph-mgr[75218]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Oct 10 05:46:29 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'k8sevents'
Oct 10 05:46:29 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'localpool'
Oct 10 05:46:29 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'mds_autoscaler'
Oct 10 05:46:29 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'mirroring'
Oct 10 05:46:29 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.8 scrub starts
Oct 10 05:46:29 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 3.8 scrub ok
Oct 10 05:46:29 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e37 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:46:29 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'nfs'
Oct 10 05:46:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:30.091+0000 7f268f1ee140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Oct 10 05:46:30 np0005479823 ceph-mgr[75218]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Oct 10 05:46:30 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'orchestrator'
Oct 10 05:46:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:30.312+0000 7f268f1ee140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Oct 10 05:46:30 np0005479823 ceph-mgr[75218]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Oct 10 05:46:30 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'osd_perf_query'
Oct 10 05:46:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:30.386+0000 7f268f1ee140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Oct 10 05:46:30 np0005479823 ceph-mgr[75218]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Oct 10 05:46:30 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'osd_support'
Oct 10 05:46:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:30.455+0000 7f268f1ee140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Oct 10 05:46:30 np0005479823 ceph-mgr[75218]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Oct 10 05:46:30 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'pg_autoscaler'
Oct 10 05:46:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:30.538+0000 7f268f1ee140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Oct 10 05:46:30 np0005479823 ceph-mgr[75218]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Oct 10 05:46:30 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'progress'
Oct 10 05:46:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:30.610+0000 7f268f1ee140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Oct 10 05:46:30 np0005479823 ceph-mgr[75218]: mgr[py] Module progress has missing NOTIFY_TYPES member
Oct 10 05:46:30 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'prometheus'
Oct 10 05:46:30 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Oct 10 05:46:30 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Oct 10 05:46:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:30.940+0000 7f268f1ee140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Oct 10 05:46:30 np0005479823 ceph-mgr[75218]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Oct 10 05:46:30 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'rbd_support'
Oct 10 05:46:30 np0005479823 systemd[1]: Stopping User Manager for UID 42477...
Oct 10 05:46:30 np0005479823 systemd[71724]: Activating special unit Exit the Session...
Oct 10 05:46:30 np0005479823 systemd[71724]: Stopped target Main User Target.
Oct 10 05:46:30 np0005479823 systemd[71724]: Stopped target Basic System.
Oct 10 05:46:30 np0005479823 systemd[71724]: Stopped target Paths.
Oct 10 05:46:30 np0005479823 systemd[71724]: Stopped target Sockets.
Oct 10 05:46:30 np0005479823 systemd[71724]: Stopped target Timers.
Oct 10 05:46:30 np0005479823 systemd[71724]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct 10 05:46:30 np0005479823 systemd[71724]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 10 05:46:30 np0005479823 systemd[71724]: Closed D-Bus User Message Bus Socket.
Oct 10 05:46:30 np0005479823 systemd[71724]: Stopped Create User's Volatile Files and Directories.
Oct 10 05:46:30 np0005479823 systemd[71724]: Removed slice User Application Slice.
Oct 10 05:46:30 np0005479823 systemd[71724]: Reached target Shutdown.
Oct 10 05:46:30 np0005479823 systemd[71724]: Finished Exit the Session.
Oct 10 05:46:30 np0005479823 systemd[71724]: Reached target Exit the Session.
Oct 10 05:46:30 np0005479823 systemd[1]: user@42477.service: Deactivated successfully.
Oct 10 05:46:30 np0005479823 systemd[1]: Stopped User Manager for UID 42477.
Oct 10 05:46:30 np0005479823 systemd[1]: Stopping User Runtime Directory /run/user/42477...
Oct 10 05:46:31 np0005479823 systemd[1]: run-user-42477.mount: Deactivated successfully.
Oct 10 05:46:31 np0005479823 systemd[1]: user-runtime-dir@42477.service: Deactivated successfully.
Oct 10 05:46:31 np0005479823 systemd[1]: Stopped User Runtime Directory /run/user/42477.
Oct 10 05:46:31 np0005479823 systemd[1]: Removed slice User Slice of UID 42477.
Oct 10 05:46:31 np0005479823 systemd[1]: user-42477.slice: Consumed 1min 7.750s CPU time.
Oct 10 05:46:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:31.050+0000 7f268f1ee140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Oct 10 05:46:31 np0005479823 ceph-mgr[75218]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Oct 10 05:46:31 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'restful'
Oct 10 05:46:31 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'rgw'
Oct 10 05:46:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:31.492+0000 7f268f1ee140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Oct 10 05:46:31 np0005479823 ceph-mgr[75218]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Oct 10 05:46:31 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'rook'
Oct 10 05:46:31 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.12 deep-scrub starts
Oct 10 05:46:31 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.12 deep-scrub ok
Oct 10 05:46:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:32.097+0000 7f268f1ee140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Oct 10 05:46:32 np0005479823 ceph-mgr[75218]: mgr[py] Module rook has missing NOTIFY_TYPES member
Oct 10 05:46:32 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'selftest'
Oct 10 05:46:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:32.187+0000 7f268f1ee140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Oct 10 05:46:32 np0005479823 ceph-mgr[75218]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Oct 10 05:46:32 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'snap_schedule'
Oct 10 05:46:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:32.273+0000 7f268f1ee140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Oct 10 05:46:32 np0005479823 ceph-mgr[75218]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Oct 10 05:46:32 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'stats'
Oct 10 05:46:32 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'status'
Oct 10 05:46:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:32.418+0000 7f268f1ee140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Oct 10 05:46:32 np0005479823 ceph-mgr[75218]: mgr[py] Module status has missing NOTIFY_TYPES member
Oct 10 05:46:32 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'telegraf'
Oct 10 05:46:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:32.484+0000 7f268f1ee140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Oct 10 05:46:32 np0005479823 ceph-mgr[75218]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Oct 10 05:46:32 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'telemetry'
Oct 10 05:46:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:32.638+0000 7f268f1ee140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Oct 10 05:46:32 np0005479823 ceph-mgr[75218]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Oct 10 05:46:32 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'test_orchestrator'
Oct 10 05:46:32 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.13 scrub starts
Oct 10 05:46:32 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.13 scrub ok
Oct 10 05:46:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:32.860+0000 7f268f1ee140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Oct 10 05:46:32 np0005479823 ceph-mgr[75218]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Oct 10 05:46:32 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'volumes'
Oct 10 05:46:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:33.111+0000 7f268f1ee140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Oct 10 05:46:33 np0005479823 ceph-mgr[75218]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Oct 10 05:46:33 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'zabbix'
Oct 10 05:46:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:46:33.183+0000 7f268f1ee140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Oct 10 05:46:33 np0005479823 ceph-mgr[75218]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Oct 10 05:46:33 np0005479823 ceph-mgr[75218]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Oct 10 05:46:33 np0005479823 ceph-mgr[75218]: mgr load Constructed class from module: dashboard
Oct 10 05:46:33 np0005479823 ceph-mgr[75218]: [dashboard INFO root] server: ssl=no host=192.168.122.102 port=8443
Oct 10 05:46:33 np0005479823 ceph-mgr[75218]: [dashboard INFO root] Configured CherryPy, starting engine...
Oct 10 05:46:33 np0005479823 ceph-mgr[75218]: [dashboard INFO root] Starting engine...
Oct 10 05:46:33 np0005479823 ceph-mgr[75218]: ms_deliver_dispatch: unhandled message 0x55e60c85f860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Oct 10 05:46:33 np0005479823 ceph-mgr[75218]: [dashboard INFO root] Engine started...
Oct 10 05:46:33 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e38 e38: 3 total, 3 up, 3 in
Oct 10 05:46:33 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.18 scrub starts
Oct 10 05:46:33 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.18 scrub ok
Oct 10 05:46:33 np0005479823 ceph-mon[74913]: Active manager daemon compute-0.xkdepb restarted
Oct 10 05:46:33 np0005479823 ceph-mon[74913]: Activating manager daemon compute-0.xkdepb
Oct 10 05:46:33 np0005479823 ceph-mon[74913]: Manager daemon compute-0.xkdepb is now available
Oct 10 05:46:33 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.xkdepb/mirror_snapshot_schedule"}]: dispatch
Oct 10 05:46:33 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.xkdepb/trash_purge_schedule"}]: dispatch
Oct 10 05:46:33 np0005479823 systemd[1]: Created slice User Slice of UID 42477.
Oct 10 05:46:33 np0005479823 systemd[1]: Starting User Runtime Directory /run/user/42477...
Oct 10 05:46:33 np0005479823 systemd-logind[796]: New session 35 of user ceph-admin.
Oct 10 05:46:33 np0005479823 systemd[1]: Finished User Runtime Directory /run/user/42477.
Oct 10 05:46:33 np0005479823 systemd[1]: Starting User Manager for UID 42477...
Oct 10 05:46:34 np0005479823 systemd[82041]: Queued start job for default target Main User Target.
Oct 10 05:46:34 np0005479823 systemd[82041]: Created slice User Application Slice.
Oct 10 05:46:34 np0005479823 systemd[82041]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 10 05:46:34 np0005479823 systemd[82041]: Started Daily Cleanup of User's Temporary Directories.
Oct 10 05:46:34 np0005479823 systemd[82041]: Reached target Paths.
Oct 10 05:46:34 np0005479823 systemd[82041]: Reached target Timers.
Oct 10 05:46:34 np0005479823 systemd[82041]: Starting D-Bus User Message Bus Socket...
Oct 10 05:46:34 np0005479823 systemd[82041]: Starting Create User's Volatile Files and Directories...
Oct 10 05:46:34 np0005479823 systemd[82041]: Listening on D-Bus User Message Bus Socket.
Oct 10 05:46:34 np0005479823 systemd[82041]: Reached target Sockets.
Oct 10 05:46:34 np0005479823 systemd[82041]: Finished Create User's Volatile Files and Directories.
Oct 10 05:46:34 np0005479823 systemd[82041]: Reached target Basic System.
Oct 10 05:46:34 np0005479823 systemd[82041]: Reached target Main User Target.
Oct 10 05:46:34 np0005479823 systemd[82041]: Startup finished in 107ms.
Oct 10 05:46:34 np0005479823 systemd[1]: Started User Manager for UID 42477.
Oct 10 05:46:34 np0005479823 systemd[1]: Started Session 35 of User ceph-admin.
Oct 10 05:46:34 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).mds e2 new map
Oct 10 05:46:34 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).mds e2 print_map#012e2#012btime 2025-10-10T09:46:34:511425+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-10-10T09:46:34.511367+0000#012modified#0112025-10-10T09:46:34.511367+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 0 members: #012 #012 
Oct 10 05:46:34 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e39 e39: 3 total, 3 up, 3 in
Oct 10 05:46:34 np0005479823 podman[82175]: 2025-10-10 09:46:34.824726368 +0000 UTC m=+0.065211100 container exec bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 05:46:34 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.10 scrub starts
Oct 10 05:46:34 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:46:34 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.10 scrub ok
Oct 10 05:46:34 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Oct 10 05:46:34 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Oct 10 05:46:34 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Oct 10 05:46:34 np0005479823 ceph-mon[74913]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Oct 10 05:46:34 np0005479823 ceph-mon[74913]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Oct 10 05:46:34 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Oct 10 05:46:34 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:34 np0005479823 podman[82175]: 2025-10-10 09:46:34.957235098 +0000 UTC m=+0.197719820 container exec_died bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 05:46:35 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.f scrub starts
Oct 10 05:46:35 np0005479823 ceph-mon[74913]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Oct 10 05:46:35 np0005479823 ceph-mon[74913]: [10/Oct/2025:09:46:34] ENGINE Bus STARTING
Oct 10 05:46:35 np0005479823 ceph-mon[74913]: [10/Oct/2025:09:46:34] ENGINE Serving on http://192.168.122.100:8765
Oct 10 05:46:35 np0005479823 ceph-mon[74913]: [10/Oct/2025:09:46:35] ENGINE Serving on https://192.168.122.100:7150
Oct 10 05:46:35 np0005479823 ceph-mon[74913]: [10/Oct/2025:09:46:35] ENGINE Bus STARTED
Oct 10 05:46:35 np0005479823 ceph-mon[74913]: [10/Oct/2025:09:46:35] ENGINE Client ('192.168.122.100', 60804) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Oct 10 05:46:35 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:35 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:35 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:35 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:35 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:35 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:35 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:35 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.f scrub ok
Oct 10 05:46:36 np0005479823 ceph-mon[74913]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Oct 10 05:46:36 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:36 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:36 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Oct 10 05:46:36 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]: dispatch
Oct 10 05:46:36 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:36 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:36 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Oct 10 05:46:36 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.c scrub starts
Oct 10 05:46:36 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.c scrub ok
Oct 10 05:46:37 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e40 e40: 3 total, 3 up, 3 in
Oct 10 05:46:37 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.d scrub starts
Oct 10 05:46:37 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.d scrub ok
Oct 10 05:46:37 np0005479823 ceph-mon[74913]: Adjusting osd_memory_target on compute-2 to 128.0M
Oct 10 05:46:37 np0005479823 ceph-mon[74913]: Unable to set osd_memory_target on compute-2 to 134243532: error parsing value: Value '134243532' is below minimum 939524096
Oct 10 05:46:37 np0005479823 ceph-mon[74913]: Adjusting osd_memory_target on compute-0 to 128.0M
Oct 10 05:46:37 np0005479823 ceph-mon[74913]: Unable to set osd_memory_target on compute-0 to 134240665: error parsing value: Value '134240665' is below minimum 939524096
Oct 10 05:46:37 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:37 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:37 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Oct 10 05:46:37 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 05:46:37 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]': finished
Oct 10 05:46:37 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]: dispatch
Oct 10 05:46:38 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e41 e41: 3 total, 3 up, 3 in
Oct 10 05:46:38 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.5 scrub starts
Oct 10 05:46:38 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.5 scrub ok
Oct 10 05:46:38 np0005479823 ceph-mon[74913]: Adjusting osd_memory_target on compute-1 to 128.0M
Oct 10 05:46:38 np0005479823 ceph-mon[74913]: Unable to set osd_memory_target on compute-1 to 134243532: error parsing value: Value '134243532' is below minimum 939524096
Oct 10 05:46:38 np0005479823 ceph-mon[74913]: Updating compute-0:/etc/ceph/ceph.conf
Oct 10 05:46:38 np0005479823 ceph-mon[74913]: Updating compute-1:/etc/ceph/ceph.conf
Oct 10 05:46:38 np0005479823 ceph-mon[74913]: Updating compute-2:/etc/ceph/ceph.conf
Oct 10 05:46:38 np0005479823 ceph-mon[74913]: Updating compute-2:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 05:46:38 np0005479823 ceph-mon[74913]: Updating compute-0:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 05:46:38 np0005479823 ceph-mon[74913]: Updating compute-1:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 05:46:38 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]': finished
Oct 10 05:46:38 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:38 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:39 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e42 e42: 3 total, 3 up, 3 in
Oct 10 05:46:39 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.b scrub starts
Oct 10 05:46:39 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.b scrub ok
Oct 10 05:46:39 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:46:39 np0005479823 ceph-mon[74913]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Oct 10 05:46:39 np0005479823 ceph-mon[74913]: Saving service nfs.cephfs spec with placement compute-0;compute-1;compute-2
Oct 10 05:46:39 np0005479823 ceph-mon[74913]: Saving service ingress.nfs.cephfs spec with placement compute-0;compute-1;compute-2
Oct 10 05:46:39 np0005479823 ceph-mon[74913]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Oct 10 05:46:39 np0005479823 ceph-mon[74913]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Oct 10 05:46:39 np0005479823 ceph-mon[74913]: Updating compute-2:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring
Oct 10 05:46:39 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:39 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:39 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:40 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.1b scrub starts
Oct 10 05:46:40 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 2.1b scrub ok
Oct 10 05:46:40 np0005479823 ceph-mon[74913]: Updating compute-0:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring
Oct 10 05:46:40 np0005479823 ceph-mon[74913]: Updating compute-1:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring
Oct 10 05:46:40 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:40 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:40 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:40 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:40 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/200213662' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Oct 10 05:46:40 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/200213662' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Oct 10 05:46:41 np0005479823 ceph-mon[74913]: Deploying daemon node-exporter.compute-0 on compute-0
Oct 10 05:46:42 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:42 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:42 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:43 np0005479823 ceph-mon[74913]: Deploying daemon node-exporter.compute-1 on compute-1
Oct 10 05:46:44 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:46:46 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/1088819812' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Oct 10 05:46:46 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:46 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:46 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:46 np0005479823 systemd[1]: Reloading.
Oct 10 05:46:46 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:46:46 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:46:46 np0005479823 systemd[1]: Reloading.
Oct 10 05:46:46 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:46:46 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:46:46 np0005479823 systemd[1]: Starting Ceph node-exporter.compute-2 for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 05:46:47 np0005479823 ceph-mon[74913]: Deploying daemon node-exporter.compute-2 on compute-2
Oct 10 05:46:47 np0005479823 bash[83514]: Trying to pull quay.io/prometheus/node-exporter:v1.7.0...
Oct 10 05:46:47 np0005479823 bash[83514]: Getting image source signatures
Oct 10 05:46:47 np0005479823 bash[83514]: Copying blob sha256:324153f2810a9927fcce320af9e4e291e0b6e805cbdd1f338386c756b9defa24
Oct 10 05:46:47 np0005479823 bash[83514]: Copying blob sha256:2abcce694348cd2c949c0e98a7400ebdfd8341021bcf6b541bc72033ce982510
Oct 10 05:46:47 np0005479823 bash[83514]: Copying blob sha256:455fd88e5221bc1e278ef2d059cd70e4df99a24e5af050ede621534276f6cf9a
Oct 10 05:46:48 np0005479823 bash[83514]: Copying config sha256:72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e
Oct 10 05:46:48 np0005479823 bash[83514]: Writing manifest to image destination
Oct 10 05:46:48 np0005479823 podman[83514]: 2025-10-10 09:46:48.194247924 +0000 UTC m=+1.134300309 container create 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 05:46:48 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8feaa84fe6ee4c345b37cd80291573594d2180df8e2bf40f472c980aaaef067b/merged/etc/node-exporter supports timestamps until 2038 (0x7fffffff)
Oct 10 05:46:48 np0005479823 podman[83514]: 2025-10-10 09:46:48.244436694 +0000 UTC m=+1.184489099 container init 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 05:46:48 np0005479823 podman[83514]: 2025-10-10 09:46:48.24913861 +0000 UTC m=+1.189190995 container start 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 05:46:48 np0005479823 bash[83514]: 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f
Oct 10 05:46:48 np0005479823 podman[83514]: 2025-10-10 09:46:48.181345734 +0000 UTC m=+1.121398139 image pull 72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e quay.io/prometheus/node-exporter:v1.7.0
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.255Z caller=node_exporter.go:192 level=info msg="Starting node_exporter" version="(version=1.7.0, branch=HEAD, revision=7333465abf9efba81876303bb57e6fadb946041b)"
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.255Z caller=node_exporter.go:193 level=info msg="Build context" build_context="(go=go1.21.4, platform=linux/amd64, user=root@35918982f6d8, date=20231112-23:53:35, tags=netgo osusergo static_build)"
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.256Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.256Z caller=diskstats_linux.go:265 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.256Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.256Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=arp
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=bcache
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=bonding
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=btrfs
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=conntrack
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=cpu
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=cpufreq
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=diskstats
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=dmi
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=edac
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=entropy
Oct 10 05:46:48 np0005479823 systemd[1]: Started Ceph node-exporter.compute-2 for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=fibrechannel
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=filefd
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=filesystem
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=hwmon
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=infiniband
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=ipvs
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=loadavg
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=mdadm
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=meminfo
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=netclass
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=netdev
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=netstat
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=nfs
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=nfsd
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=nvme
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=os
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=powersupplyclass
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=pressure
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=rapl
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=schedstat
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=selinux
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=sockstat
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=softnet
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=stat
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=tapestats
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=textfile
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=thermal_zone
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=time
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=udp_queues
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=uname
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=vmstat
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=xfs
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.258Z caller=node_exporter.go:117 level=info collector=zfs
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.259Z caller=tls_config.go:274 level=info msg="Listening on" address=[::]:9100
Oct 10 05:46:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2[83590]: ts=2025-10-10T09:46:48.260Z caller=tls_config.go:277 level=info msg="TLS is disabled." http2=false address=[::]:9100
Oct 10 05:46:49 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:49 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:49 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:49 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:49 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 05:46:49 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:49 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:46:53 np0005479823 podman[83690]: 2025-10-10 09:46:53.584299112 +0000 UTC m=+0.033283365 container create a59525fdc9664eaad2382cfda19ff63158a52ac510ecaa1f9ec16e549bc2abb6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_tharp, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Oct 10 05:46:53 np0005479823 systemd[1]: Started libpod-conmon-a59525fdc9664eaad2382cfda19ff63158a52ac510ecaa1f9ec16e549bc2abb6.scope.
Oct 10 05:46:53 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:46:53 np0005479823 podman[83690]: 2025-10-10 09:46:53.64726273 +0000 UTC m=+0.096247003 container init a59525fdc9664eaad2382cfda19ff63158a52ac510ecaa1f9ec16e549bc2abb6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_tharp, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Oct 10 05:46:53 np0005479823 podman[83690]: 2025-10-10 09:46:53.65388395 +0000 UTC m=+0.102868213 container start a59525fdc9664eaad2382cfda19ff63158a52ac510ecaa1f9ec16e549bc2abb6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_tharp, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 05:46:53 np0005479823 podman[83690]: 2025-10-10 09:46:53.657026714 +0000 UTC m=+0.106010957 container attach a59525fdc9664eaad2382cfda19ff63158a52ac510ecaa1f9ec16e549bc2abb6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_tharp, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 05:46:53 np0005479823 musing_tharp[83706]: 167 167
Oct 10 05:46:53 np0005479823 podman[83690]: 2025-10-10 09:46:53.659131004 +0000 UTC m=+0.108115257 container died a59525fdc9664eaad2382cfda19ff63158a52ac510ecaa1f9ec16e549bc2abb6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_tharp, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, OSD_FLAVOR=default)
Oct 10 05:46:53 np0005479823 systemd[1]: libpod-a59525fdc9664eaad2382cfda19ff63158a52ac510ecaa1f9ec16e549bc2abb6.scope: Deactivated successfully.
Oct 10 05:46:53 np0005479823 podman[83690]: 2025-10-10 09:46:53.568785117 +0000 UTC m=+0.017769390 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:46:53 np0005479823 systemd[1]: var-lib-containers-storage-overlay-915c4c4a853f005b9b4db784cfbf18718f803b70c72649a21bab246578cd4cef-merged.mount: Deactivated successfully.
Oct 10 05:46:53 np0005479823 podman[83690]: 2025-10-10 09:46:53.693190483 +0000 UTC m=+0.142174736 container remove a59525fdc9664eaad2382cfda19ff63158a52ac510ecaa1f9ec16e549bc2abb6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_tharp, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 05:46:53 np0005479823 systemd[1]: libpod-conmon-a59525fdc9664eaad2382cfda19ff63158a52ac510ecaa1f9ec16e549bc2abb6.scope: Deactivated successfully.
Oct 10 05:46:53 np0005479823 systemd[1]: Reloading.
Oct 10 05:46:53 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:46:53 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:46:54 np0005479823 systemd[1]: Reloading.
Oct 10 05:46:54 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:54 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:54 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.qujzwn", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Oct 10 05:46:54 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.qujzwn", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Oct 10 05:46:54 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:54 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:46:54 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:46:54 np0005479823 systemd[1]: Starting Ceph rgw.rgw.compute-2.qujzwn for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 05:46:54 np0005479823 podman[83847]: 2025-10-10 09:46:54.501133177 +0000 UTC m=+0.036061227 container create 5800067cdbcc263d30e91141fbfd65d1b3e7f5b67048140f597794aacb645a20 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-rgw-rgw-compute-2-qujzwn, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 05:46:54 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54d434abdc246f356a32cc9bc843623f9c05eccf5c99a502203f8a69fbae1c8f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 05:46:54 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54d434abdc246f356a32cc9bc843623f9c05eccf5c99a502203f8a69fbae1c8f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 05:46:54 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54d434abdc246f356a32cc9bc843623f9c05eccf5c99a502203f8a69fbae1c8f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 10 05:46:54 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54d434abdc246f356a32cc9bc843623f9c05eccf5c99a502203f8a69fbae1c8f/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-2.qujzwn supports timestamps until 2038 (0x7fffffff)
Oct 10 05:46:54 np0005479823 podman[83847]: 2025-10-10 09:46:54.559360128 +0000 UTC m=+0.094288198 container init 5800067cdbcc263d30e91141fbfd65d1b3e7f5b67048140f597794aacb645a20 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-rgw-rgw-compute-2-qujzwn, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 05:46:54 np0005479823 podman[83847]: 2025-10-10 09:46:54.563971011 +0000 UTC m=+0.098899061 container start 5800067cdbcc263d30e91141fbfd65d1b3e7f5b67048140f597794aacb645a20 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-rgw-rgw-compute-2-qujzwn, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Oct 10 05:46:54 np0005479823 bash[83847]: 5800067cdbcc263d30e91141fbfd65d1b3e7f5b67048140f597794aacb645a20
Oct 10 05:46:54 np0005479823 podman[83847]: 2025-10-10 09:46:54.483367537 +0000 UTC m=+0.018295617 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:46:54 np0005479823 systemd[1]: Started Ceph rgw.rgw.compute-2.qujzwn for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 05:46:54 np0005479823 radosgw[83867]: deferred set uid:gid to 167:167 (ceph:ceph)
Oct 10 05:46:54 np0005479823 radosgw[83867]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process radosgw, pid 2
Oct 10 05:46:54 np0005479823 radosgw[83867]: framework: beast
Oct 10 05:46:54 np0005479823 radosgw[83867]: framework conf key: endpoint, val: 192.168.122.102:8082
Oct 10 05:46:54 np0005479823 radosgw[83867]: init_numa not setting numa affinity
Oct 10 05:46:54 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:46:55 np0005479823 ceph-mon[74913]: Deploying daemon rgw.rgw.compute-2.qujzwn on compute-2
Oct 10 05:46:55 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:55 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:55 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:55 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.zajetc", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Oct 10 05:46:55 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.zajetc", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Oct 10 05:46:55 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e43 e43: 3 total, 3 up, 3 in
Oct 10 05:46:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"} v 0)
Oct 10 05:46:55 np0005479823 ceph-mon[74913]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/2866042771' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Oct 10 05:46:56 np0005479823 ceph-mon[74913]: Deploying daemon rgw.rgw.compute-1.zajetc on compute-1
Oct 10 05:46:56 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.102:0/2866042771' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Oct 10 05:46:56 np0005479823 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Oct 10 05:46:56 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e44 e44: 3 total, 3 up, 3 in
Oct 10 05:46:57 np0005479823 ceph-mon[74913]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Oct 10 05:46:57 np0005479823 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-2.qujzwn' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Oct 10 05:46:57 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:57 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:57 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:57 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.myiozw", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Oct 10 05:46:57 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.myiozw", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Oct 10 05:46:57 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:57 np0005479823 ceph-mon[74913]: Deploying daemon rgw.rgw.compute-0.myiozw on compute-0
Oct 10 05:46:57 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e45 e45: 3 total, 3 up, 3 in
Oct 10 05:46:57 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0)
Oct 10 05:46:57 np0005479823 ceph-mon[74913]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/2659714554' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Oct 10 05:46:58 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.102:0/2659714554' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Oct 10 05:46:58 np0005479823 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Oct 10 05:46:58 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.101:0/877657232' entity='client.rgw.rgw.compute-1.zajetc' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Oct 10 05:46:58 np0005479823 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-1.zajetc' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Oct 10 05:46:58 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e46 e46: 3 total, 3 up, 3 in
Oct 10 05:46:59 np0005479823 podman[84544]: 2025-10-10 09:46:59.188804323 +0000 UTC m=+0.035001022 container create 9f7eb096448b9104e4b4f7e23ad94f309320f0e8f9a1c4324db33fa7e3fabe3b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_perlman, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Oct 10 05:46:59 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e47 e47: 3 total, 3 up, 3 in
Oct 10 05:46:59 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0)
Oct 10 05:46:59 np0005479823 ceph-mon[74913]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/2659714554' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Oct 10 05:46:59 np0005479823 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-2.qujzwn' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Oct 10 05:46:59 np0005479823 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-1.zajetc' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Oct 10 05:46:59 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:59 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:59 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:59 np0005479823 ceph-mon[74913]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Oct 10 05:46:59 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:59 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:46:59 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.vlgajy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Oct 10 05:46:59 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.vlgajy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Oct 10 05:46:59 np0005479823 ceph-mon[74913]: Deploying daemon mds.cephfs.compute-2.vlgajy on compute-2
Oct 10 05:46:59 np0005479823 systemd[1]: Started libpod-conmon-9f7eb096448b9104e4b4f7e23ad94f309320f0e8f9a1c4324db33fa7e3fabe3b.scope.
Oct 10 05:46:59 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:46:59 np0005479823 podman[84544]: 2025-10-10 09:46:59.268249007 +0000 UTC m=+0.114445726 container init 9f7eb096448b9104e4b4f7e23ad94f309320f0e8f9a1c4324db33fa7e3fabe3b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_perlman, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct 10 05:46:59 np0005479823 podman[84544]: 2025-10-10 09:46:59.173007618 +0000 UTC m=+0.019204357 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:46:59 np0005479823 podman[84544]: 2025-10-10 09:46:59.273723268 +0000 UTC m=+0.119919977 container start 9f7eb096448b9104e4b4f7e23ad94f309320f0e8f9a1c4324db33fa7e3fabe3b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_perlman, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Oct 10 05:46:59 np0005479823 podman[84544]: 2025-10-10 09:46:59.276346805 +0000 UTC m=+0.122543544 container attach 9f7eb096448b9104e4b4f7e23ad94f309320f0e8f9a1c4324db33fa7e3fabe3b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_perlman, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.build-date=20250325)
Oct 10 05:46:59 np0005479823 strange_perlman[84560]: 167 167
Oct 10 05:46:59 np0005479823 systemd[1]: libpod-9f7eb096448b9104e4b4f7e23ad94f309320f0e8f9a1c4324db33fa7e3fabe3b.scope: Deactivated successfully.
Oct 10 05:46:59 np0005479823 podman[84544]: 2025-10-10 09:46:59.279366485 +0000 UTC m=+0.125563204 container died 9f7eb096448b9104e4b4f7e23ad94f309320f0e8f9a1c4324db33fa7e3fabe3b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_perlman, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325)
Oct 10 05:46:59 np0005479823 systemd[1]: var-lib-containers-storage-overlay-ea4e5282219a9c3ad20e5648cbd3f6dabaf721bc8bd7c3e59a27979325aa1cae-merged.mount: Deactivated successfully.
Oct 10 05:46:59 np0005479823 podman[84544]: 2025-10-10 09:46:59.317046025 +0000 UTC m=+0.163242734 container remove 9f7eb096448b9104e4b4f7e23ad94f309320f0e8f9a1c4324db33fa7e3fabe3b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_perlman, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 05:46:59 np0005479823 systemd[1]: libpod-conmon-9f7eb096448b9104e4b4f7e23ad94f309320f0e8f9a1c4324db33fa7e3fabe3b.scope: Deactivated successfully.
Oct 10 05:46:59 np0005479823 systemd[1]: Reloading.
Oct 10 05:46:59 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:46:59 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:46:59 np0005479823 systemd[1]: Reloading.
Oct 10 05:46:59 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:46:59 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:46:59 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e47 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:46:59 np0005479823 systemd[1]: Starting Ceph mds.cephfs.compute-2.vlgajy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 05:47:00 np0005479823 podman[84704]: 2025-10-10 09:47:00.069028813 +0000 UTC m=+0.038848229 container create ae094c500c98e27a9d77505176172d5dcddf180ecdc3a15416df9589a6fb1109 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mds-cephfs-compute-2-vlgajy, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Oct 10 05:47:00 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4b0ebf5d25c155ec5522c8f977ff660b395ef0bd052c28d0c4fa42a5f174a5d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 05:47:00 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4b0ebf5d25c155ec5522c8f977ff660b395ef0bd052c28d0c4fa42a5f174a5d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 05:47:00 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4b0ebf5d25c155ec5522c8f977ff660b395ef0bd052c28d0c4fa42a5f174a5d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 10 05:47:00 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4b0ebf5d25c155ec5522c8f977ff660b395ef0bd052c28d0c4fa42a5f174a5d/merged/var/lib/ceph/mds/ceph-cephfs.compute-2.vlgajy supports timestamps until 2038 (0x7fffffff)
Oct 10 05:47:00 np0005479823 podman[84704]: 2025-10-10 09:47:00.133377157 +0000 UTC m=+0.103196563 container init ae094c500c98e27a9d77505176172d5dcddf180ecdc3a15416df9589a6fb1109 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mds-cephfs-compute-2-vlgajy, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 10 05:47:00 np0005479823 podman[84704]: 2025-10-10 09:47:00.13888457 +0000 UTC m=+0.108703986 container start ae094c500c98e27a9d77505176172d5dcddf180ecdc3a15416df9589a6fb1109 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mds-cephfs-compute-2-vlgajy, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 05:47:00 np0005479823 bash[84704]: ae094c500c98e27a9d77505176172d5dcddf180ecdc3a15416df9589a6fb1109
Oct 10 05:47:00 np0005479823 podman[84704]: 2025-10-10 09:47:00.049234777 +0000 UTC m=+0.019054203 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:47:00 np0005479823 systemd[1]: Started Ceph mds.cephfs.compute-2.vlgajy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 05:47:00 np0005479823 ceph-mds[84723]: set uid:gid to 167:167 (ceph:ceph)
Oct 10 05:47:00 np0005479823 ceph-mds[84723]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mds, pid 2
Oct 10 05:47:00 np0005479823 ceph-mds[84723]: main not setting numa affinity
Oct 10 05:47:00 np0005479823 ceph-mds[84723]: pidfile_write: ignore empty --pid-file
Oct 10 05:47:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mds-cephfs-compute-2-vlgajy[84719]: starting mds.cephfs.compute-2.vlgajy at 
Oct 10 05:47:00 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy Updating MDS map to version 2 from mon.1
Oct 10 05:47:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e48 e48: 3 total, 3 up, 3 in
Oct 10 05:47:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).mds e3 new map
Oct 10 05:47:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).mds e3 print_map#012e3#012btime 2025-10-10T09:47:00:211513+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-10-10T09:46:34.511367+0000#012modified#0112025-10-10T09:46:34.511367+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 0 members: #012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-2.vlgajy{-1:24337} state up:standby seq 1 addr [v2:192.168.122.102:6804/1744510053,v1:192.168.122.102:6805/1744510053] compat {c=[1],r=[1],i=[1fff]}]
Oct 10 05:47:00 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.102:0/2659714554' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Oct 10 05:47:00 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.101:0/877657232' entity='client.rgw.rgw.compute-1.zajetc' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Oct 10 05:47:00 np0005479823 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Oct 10 05:47:00 np0005479823 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-1.zajetc' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Oct 10 05:47:00 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/2946038047' entity='client.rgw.rgw.compute-0.myiozw' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Oct 10 05:47:00 np0005479823 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-2.qujzwn' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Oct 10 05:47:00 np0005479823 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-1.zajetc' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Oct 10 05:47:00 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/2946038047' entity='client.rgw.rgw.compute-0.myiozw' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Oct 10 05:47:00 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy Updating MDS map to version 3 from mon.1
Oct 10 05:47:00 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy Monitors have assigned me to become a standby
Oct 10 05:47:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).mds e4 new map
Oct 10 05:47:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).mds e4 print_map#012e4#012btime 2025-10-10T09:47:00:244509+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0114#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-10-10T09:46:34.511367+0000#012modified#0112025-10-10T09:47:00.244232+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24337}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 0 members: #012[mds.cephfs.compute-2.vlgajy{0:24337} state up:creating seq 1 addr [v2:192.168.122.102:6804/1744510053,v1:192.168.122.102:6805/1744510053] compat {c=[1],r=[1],i=[1fff]}]#012 #012 
Oct 10 05:47:00 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy Updating MDS map to version 4 from mon.1
Oct 10 05:47:00 np0005479823 ceph-mds[84723]: mds.0.4 handle_mds_map I am now mds.0.4
Oct 10 05:47:00 np0005479823 ceph-mds[84723]: mds.0.4 handle_mds_map state change up:standby --> up:creating
Oct 10 05:47:00 np0005479823 ceph-mds[84723]: mds.0.cache creating system inode with ino:0x1
Oct 10 05:47:00 np0005479823 ceph-mds[84723]: mds.0.cache creating system inode with ino:0x100
Oct 10 05:47:00 np0005479823 ceph-mds[84723]: mds.0.cache creating system inode with ino:0x600
Oct 10 05:47:00 np0005479823 ceph-mds[84723]: mds.0.cache creating system inode with ino:0x601
Oct 10 05:47:00 np0005479823 ceph-mds[84723]: mds.0.cache creating system inode with ino:0x602
Oct 10 05:47:00 np0005479823 ceph-mds[84723]: mds.0.cache creating system inode with ino:0x603
Oct 10 05:47:00 np0005479823 ceph-mds[84723]: mds.0.cache creating system inode with ino:0x604
Oct 10 05:47:00 np0005479823 ceph-mds[84723]: mds.0.cache creating system inode with ino:0x605
Oct 10 05:47:00 np0005479823 ceph-mds[84723]: mds.0.cache creating system inode with ino:0x606
Oct 10 05:47:00 np0005479823 ceph-mds[84723]: mds.0.cache creating system inode with ino:0x607
Oct 10 05:47:00 np0005479823 ceph-mds[84723]: mds.0.cache creating system inode with ino:0x608
Oct 10 05:47:00 np0005479823 ceph-mds[84723]: mds.0.cache creating system inode with ino:0x609
Oct 10 05:47:00 np0005479823 ceph-mds[84723]: mds.0.4 creating_done
Oct 10 05:47:01 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e49 e49: 3 total, 3 up, 3 in
Oct 10 05:47:01 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0)
Oct 10 05:47:01 np0005479823 ceph-mon[74913]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/2659714554' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Oct 10 05:47:01 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:01 np0005479823 ceph-mon[74913]: daemon mds.cephfs.compute-2.vlgajy assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Oct 10 05:47:01 np0005479823 ceph-mon[74913]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Oct 10 05:47:01 np0005479823 ceph-mon[74913]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Oct 10 05:47:01 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:01 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:01 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.cchwlo", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Oct 10 05:47:01 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.cchwlo", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Oct 10 05:47:01 np0005479823 ceph-mon[74913]: daemon mds.cephfs.compute-2.vlgajy is now active in filesystem cephfs as rank 0
Oct 10 05:47:01 np0005479823 ceph-mon[74913]: Deploying daemon mds.cephfs.compute-0.cchwlo on compute-0
Oct 10 05:47:01 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).mds e5 new map
Oct 10 05:47:01 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).mds e5 print_map#012e5#012btime 2025-10-10T09:47:01:287113+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-10-10T09:46:34.511367+0000#012modified#0112025-10-10T09:47:01.287110+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24337}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 24337 members: 24337#012[mds.cephfs.compute-2.vlgajy{0:24337} state up:active seq 2 addr [v2:192.168.122.102:6804/1744510053,v1:192.168.122.102:6805/1744510053] compat {c=[1],r=[1],i=[1fff]}]#012 #012 
Oct 10 05:47:01 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy Updating MDS map to version 5 from mon.1
Oct 10 05:47:01 np0005479823 ceph-mds[84723]: mds.0.4 handle_mds_map I am now mds.0.4
Oct 10 05:47:01 np0005479823 ceph-mds[84723]: mds.0.4 handle_mds_map state change up:creating --> up:active
Oct 10 05:47:01 np0005479823 ceph-mds[84723]: mds.0.4 recovery_done -- successful recovery!
Oct 10 05:47:01 np0005479823 ceph-mds[84723]: mds.0.4 active_start
Oct 10 05:47:02 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e50 e50: 3 total, 3 up, 3 in
Oct 10 05:47:02 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0)
Oct 10 05:47:02 np0005479823 ceph-mon[74913]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/2659714554' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Oct 10 05:47:02 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.102:0/2659714554' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Oct 10 05:47:02 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.101:0/877657232' entity='client.rgw.rgw.compute-1.zajetc' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Oct 10 05:47:02 np0005479823 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Oct 10 05:47:02 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/2946038047' entity='client.rgw.rgw.compute-0.myiozw' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Oct 10 05:47:02 np0005479823 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-1.zajetc' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Oct 10 05:47:02 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:02 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:02 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:02 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.fhagzt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Oct 10 05:47:02 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.fhagzt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Oct 10 05:47:02 np0005479823 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-2.qujzwn' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Oct 10 05:47:02 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/2946038047' entity='client.rgw.rgw.compute-0.myiozw' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Oct 10 05:47:02 np0005479823 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-1.zajetc' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Oct 10 05:47:02 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/2946038047' entity='client.rgw.rgw.compute-0.myiozw' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Oct 10 05:47:02 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).mds e6 new map
Oct 10 05:47:02 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).mds e6 print_map#012e6#012btime 2025-10-10T09:47:02:297566+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-10-10T09:46:34.511367+0000#012modified#0112025-10-10T09:47:01.287110+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24337}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012qdb_cluster#011leader: 24337 members: 24337#012[mds.cephfs.compute-2.vlgajy{0:24337} state up:active seq 2 addr [v2:192.168.122.102:6804/1744510053,v1:192.168.122.102:6805/1744510053] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.cchwlo{-1:14592} state up:standby seq 1 addr [v2:192.168.122.100:6806/327772839,v1:192.168.122.100:6807/327772839] compat {c=[1],r=[1],i=[1fff]}]
Oct 10 05:47:02 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).mds e7 new map
Oct 10 05:47:02 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).mds e7 print_map#012e7#012btime 2025-10-10T09:47:02:322797+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-10-10T09:46:34.511367+0000#012modified#0112025-10-10T09:47:01.287110+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24337}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24337 members: 24337#012[mds.cephfs.compute-2.vlgajy{0:24337} state up:active seq 2 addr [v2:192.168.122.102:6804/1744510053,v1:192.168.122.102:6805/1744510053] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.cchwlo{-1:14592} state up:standby seq 1 addr [v2:192.168.122.100:6806/327772839,v1:192.168.122.100:6807/327772839] compat {c=[1],r=[1],i=[1fff]}]
Oct 10 05:47:03 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e51 e51: 3 total, 3 up, 3 in
Oct 10 05:47:03 np0005479823 ceph-mon[74913]: Deploying daemon mds.cephfs.compute-1.fhagzt on compute-1
Oct 10 05:47:03 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.102:0/2659714554' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Oct 10 05:47:03 np0005479823 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-2.qujzwn' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Oct 10 05:47:03 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.101:0/877657232' entity='client.rgw.rgw.compute-1.zajetc' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Oct 10 05:47:03 np0005479823 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-1.zajetc' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Oct 10 05:47:03 np0005479823 ceph-mon[74913]: from='client.? 192.168.122.100:0/2946038047' entity='client.rgw.rgw.compute-0.myiozw' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Oct 10 05:47:03 np0005479823 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-2.qujzwn' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Oct 10 05:47:03 np0005479823 ceph-mon[74913]: from='client.? ' entity='client.rgw.rgw.compute-1.zajetc' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Oct 10 05:47:03 np0005479823 radosgw[83867]: v1 topic migration: starting v1 topic migration..
Oct 10 05:47:03 np0005479823 radosgw[83867]: LDAP not started since no server URIs were provided in the configuration.
Oct 10 05:47:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-rgw-rgw-compute-2-qujzwn[83863]: 2025-10-10T09:47:03.484+0000 7f3ea7b78980 -1 LDAP not started since no server URIs were provided in the configuration.
Oct 10 05:47:03 np0005479823 radosgw[83867]: v1 topic migration: finished v1 topic migration
Oct 10 05:47:03 np0005479823 radosgw[83867]: framework: beast
Oct 10 05:47:03 np0005479823 radosgw[83867]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Oct 10 05:47:03 np0005479823 radosgw[83867]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Oct 10 05:47:03 np0005479823 radosgw[83867]: starting handler: beast
Oct 10 05:47:03 np0005479823 radosgw[83867]: set uid:gid to 167:167 (ceph:ceph)
Oct 10 05:47:03 np0005479823 radosgw[83867]: mgrc service_daemon_register rgw.24304 metadata {arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.102:8082,frontend_type#0=beast,hostname=compute-2,id=rgw.compute-2.qujzwn,kernel_description=#1 SMP PREEMPT_DYNAMIC Tue Sep 30 07:37:35 UTC 2025,kernel_version=5.14.0-621.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864356,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=ac475a20-bf0e-4531-bd8b-a44afde7c93f,zone_name=default,zonegroup_id=8929b431-04ce-48e1-bb4a-cedab812d97d,zonegroup_name=default}
Oct 10 05:47:03 np0005479823 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Oct 10 05:47:03 np0005479823 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Oct 10 05:47:03 np0005479823 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Oct 10 05:47:03 np0005479823 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Oct 10 05:47:03 np0005479823 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Oct 10 05:47:04 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).mds e8 new map
Oct 10 05:47:04 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).mds e8 print_map#012e8#012btime 2025-10-10T09:47:04:615775+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0118#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-10-10T09:46:34.511367+0000#012modified#0112025-10-10T09:47:04.295946+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24337}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24337 members: 24337#012[mds.cephfs.compute-2.vlgajy{0:24337} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/1744510053,v1:192.168.122.102:6805/1744510053] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.cchwlo{-1:14592} state up:standby seq 1 addr [v2:192.168.122.100:6806/327772839,v1:192.168.122.100:6807/327772839] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.fhagzt{-1:24206} state up:standby seq 1 addr [v2:192.168.122.101:6804/1757766640,v1:192.168.122.101:6805/1757766640] compat {c=[1],r=[1],i=[1fff]}]
Oct 10 05:47:04 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy Updating MDS map to version 8 from mon.1
Oct 10 05:47:04 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:04 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:04 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:04 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:04 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:04 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:04 np0005479823 ceph-mon[74913]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Oct 10 05:47:04 np0005479823 ceph-mon[74913]: Cluster is now healthy
Oct 10 05:47:04 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:04 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.mssvzx", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Oct 10 05:47:04 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.mssvzx", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Oct 10 05:47:04 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Oct 10 05:47:04 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Oct 10 05:47:04 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Oct 10 05:47:04 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Oct 10 05:47:04 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.mssvzx-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Oct 10 05:47:04 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.mssvzx-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Oct 10 05:47:04 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:47:05 np0005479823 ceph-mds[84723]: mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Oct 10 05:47:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mds-cephfs-compute-2-vlgajy[84719]: 2025-10-10T09:47:05.250+0000 7efc8f0e1640 -1 mds.pinger is_rank_lagging: rank=0 was never sent ping request.
Oct 10 05:47:05 np0005479823 ceph-mon[74913]: Creating key for client.nfs.cephfs.0.0.compute-1.mssvzx
Oct 10 05:47:05 np0005479823 ceph-mon[74913]: Ensuring nfs.cephfs.0 is in the ganesha grace table
Oct 10 05:47:05 np0005479823 ceph-mon[74913]: Rados config object exists: conf-nfs.cephfs
Oct 10 05:47:05 np0005479823 ceph-mon[74913]: Creating key for client.nfs.cephfs.0.0.compute-1.mssvzx-rgw
Oct 10 05:47:05 np0005479823 ceph-mon[74913]: Bind address in nfs.cephfs.0.0.compute-1.mssvzx's ganesha conf is defaulting to empty
Oct 10 05:47:05 np0005479823 ceph-mon[74913]: Deploying daemon nfs.cephfs.0.0.compute-1.mssvzx on compute-1
Oct 10 05:47:06 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).mds e9 new map
Oct 10 05:47:06 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).mds e9 print_map#012e9#012btime 2025-10-10T09:47:06:672904+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0118#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-10-10T09:46:34.511367+0000#012modified#0112025-10-10T09:47:04.295946+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24337}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24337 members: 24337#012[mds.cephfs.compute-2.vlgajy{0:24337} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/1744510053,v1:192.168.122.102:6805/1744510053] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.cchwlo{-1:14592} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/327772839,v1:192.168.122.100:6807/327772839] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.fhagzt{-1:24206} state up:standby seq 1 addr [v2:192.168.122.101:6804/1757766640,v1:192.168.122.101:6805/1757766640] compat {c=[1],r=[1],i=[1fff]}]
Oct 10 05:47:07 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:07 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:07 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:07 np0005479823 ceph-mon[74913]: Creating key for client.nfs.cephfs.1.0.compute-2.boccfy
Oct 10 05:47:07 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.boccfy", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Oct 10 05:47:07 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.boccfy", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Oct 10 05:47:07 np0005479823 ceph-mon[74913]: Ensuring nfs.cephfs.1 is in the ganesha grace table
Oct 10 05:47:07 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Oct 10 05:47:07 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Oct 10 05:47:08 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).mds e10 new map
Oct 10 05:47:08 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).mds e10 print_map#012e10#012btime 2025-10-10T09:47:08:789045+0000#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0118#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-10-10T09:46:34.511367+0000#012modified#0112025-10-10T09:47:04.295946+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=24337}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 24337 members: 24337#012[mds.cephfs.compute-2.vlgajy{0:24337} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/1744510053,v1:192.168.122.102:6805/1744510053] compat {c=[1],r=[1],i=[1fff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.cchwlo{-1:14592} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/327772839,v1:192.168.122.100:6807/327772839] compat {c=[1],r=[1],i=[1fff]}]#012[mds.cephfs.compute-1.fhagzt{-1:24206} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/1757766640,v1:192.168.122.101:6805/1757766640] compat {c=[1],r=[1],i=[1fff]}]
Oct 10 05:47:08 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:09 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:47:10 np0005479823 podman[84880]: 2025-10-10 09:47:10.526056472 +0000 UTC m=+0.036730760 container create 40d5a503773f091bd4b185b05f20c070ba36d03086abec9bf1db9e23dbf195f0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_hellman, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 10 05:47:10 np0005479823 systemd[1]: Started libpod-conmon-40d5a503773f091bd4b185b05f20c070ba36d03086abec9bf1db9e23dbf195f0.scope.
Oct 10 05:47:10 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:47:10 np0005479823 podman[84880]: 2025-10-10 09:47:10.598148022 +0000 UTC m=+0.108822350 container init 40d5a503773f091bd4b185b05f20c070ba36d03086abec9bf1db9e23dbf195f0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_hellman, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 05:47:10 np0005479823 podman[84880]: 2025-10-10 09:47:10.604983679 +0000 UTC m=+0.115657987 container start 40d5a503773f091bd4b185b05f20c070ba36d03086abec9bf1db9e23dbf195f0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_hellman, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2)
Oct 10 05:47:10 np0005479823 podman[84880]: 2025-10-10 09:47:10.511020833 +0000 UTC m=+0.021695151 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:47:10 np0005479823 podman[84880]: 2025-10-10 09:47:10.60830112 +0000 UTC m=+0.118975438 container attach 40d5a503773f091bd4b185b05f20c070ba36d03086abec9bf1db9e23dbf195f0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_hellman, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 05:47:10 np0005479823 magical_hellman[84896]: 167 167
Oct 10 05:47:10 np0005479823 systemd[1]: libpod-40d5a503773f091bd4b185b05f20c070ba36d03086abec9bf1db9e23dbf195f0.scope: Deactivated successfully.
Oct 10 05:47:10 np0005479823 podman[84880]: 2025-10-10 09:47:10.609288882 +0000 UTC m=+0.119963180 container died 40d5a503773f091bd4b185b05f20c070ba36d03086abec9bf1db9e23dbf195f0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_hellman, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 05:47:10 np0005479823 systemd[1]: var-lib-containers-storage-overlay-ba91f52bfc249c2abaa9b2534619c7879ccc57ac6cf15fd7ef0a318f8659aad6-merged.mount: Deactivated successfully.
Oct 10 05:47:10 np0005479823 podman[84880]: 2025-10-10 09:47:10.652407742 +0000 UTC m=+0.163082060 container remove 40d5a503773f091bd4b185b05f20c070ba36d03086abec9bf1db9e23dbf195f0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_hellman, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Oct 10 05:47:10 np0005479823 systemd[1]: libpod-conmon-40d5a503773f091bd4b185b05f20c070ba36d03086abec9bf1db9e23dbf195f0.scope: Deactivated successfully.
Oct 10 05:47:10 np0005479823 systemd[1]: Reloading.
Oct 10 05:47:10 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:47:10 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:47:10 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Oct 10 05:47:10 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Oct 10 05:47:10 np0005479823 ceph-mon[74913]: Rados config object exists: conf-nfs.cephfs
Oct 10 05:47:10 np0005479823 ceph-mon[74913]: Creating key for client.nfs.cephfs.1.0.compute-2.boccfy-rgw
Oct 10 05:47:10 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.boccfy-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Oct 10 05:47:10 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.boccfy-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Oct 10 05:47:10 np0005479823 ceph-mon[74913]: Bind address in nfs.cephfs.1.0.compute-2.boccfy's ganesha conf is defaulting to empty
Oct 10 05:47:10 np0005479823 ceph-mon[74913]: Deploying daemon nfs.cephfs.1.0.compute-2.boccfy on compute-2
Oct 10 05:47:10 np0005479823 systemd[1]: Reloading.
Oct 10 05:47:11 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:47:11 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:47:11 np0005479823 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 05:47:11 np0005479823 podman[85035]: 2025-10-10 09:47:11.535652613 +0000 UTC m=+0.065229334 container create c0c699e75157b12e12bd70966bb74a53cb85c47327c540646fc648e6218a8292 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct 10 05:47:11 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/65c18ffc3984bb82f7acc157cc3b25e9b8553569bbeae84a5fa3da5f5bd939d9/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 05:47:11 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/65c18ffc3984bb82f7acc157cc3b25e9b8553569bbeae84a5fa3da5f5bd939d9/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 05:47:11 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/65c18ffc3984bb82f7acc157cc3b25e9b8553569bbeae84a5fa3da5f5bd939d9/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 05:47:11 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/65c18ffc3984bb82f7acc157cc3b25e9b8553569bbeae84a5fa3da5f5bd939d9/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 05:47:11 np0005479823 podman[85035]: 2025-10-10 09:47:11.513531199 +0000 UTC m=+0.043107940 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:47:11 np0005479823 podman[85035]: 2025-10-10 09:47:11.612552893 +0000 UTC m=+0.142129584 container init c0c699e75157b12e12bd70966bb74a53cb85c47327c540646fc648e6218a8292 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Oct 10 05:47:11 np0005479823 podman[85035]: 2025-10-10 09:47:11.617176086 +0000 UTC m=+0.146752767 container start c0c699e75157b12e12bd70966bb74a53cb85c47327c540646fc648e6218a8292 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 05:47:11 np0005479823 bash[85035]: c0c699e75157b12e12bd70966bb74a53cb85c47327c540646fc648e6218a8292
Oct 10 05:47:11 np0005479823 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000002:nfs.cephfs.1: -2
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Oct 10 05:47:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 05:47:12 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:12 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:12 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:12 np0005479823 ceph-mon[74913]: Creating key for client.nfs.cephfs.2.0.compute-0.ruydzo
Oct 10 05:47:12 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.ruydzo", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Oct 10 05:47:12 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.ruydzo", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Oct 10 05:47:12 np0005479823 ceph-mon[74913]: Ensuring nfs.cephfs.2 is in the ganesha grace table
Oct 10 05:47:12 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Oct 10 05:47:12 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Oct 10 05:47:12 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Oct 10 05:47:12 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Oct 10 05:47:12 np0005479823 ceph-mon[74913]: Rados config object exists: conf-nfs.cephfs
Oct 10 05:47:12 np0005479823 ceph-mon[74913]: Creating key for client.nfs.cephfs.2.0.compute-0.ruydzo-rgw
Oct 10 05:47:12 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.ruydzo-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Oct 10 05:47:12 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.ruydzo-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Oct 10 05:47:12 np0005479823 ceph-mon[74913]: Bind address in nfs.cephfs.2.0.compute-0.ruydzo's ganesha conf is defaulting to empty
Oct 10 05:47:12 np0005479823 ceph-mon[74913]: Deploying daemon nfs.cephfs.2.0.compute-0.ruydzo on compute-0
Oct 10 05:47:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:13 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 05:47:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:13 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 05:47:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:13 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 05:47:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:13 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 05:47:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:13 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 05:47:14 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:14 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:14 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:14 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:14 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:14 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:14 np0005479823 ceph-mon[74913]: Deploying daemon haproxy.nfs.cephfs.compute-1.ehhoyw on compute-1
Oct 10 05:47:14 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:47:18 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:19 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8968000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:19 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:47:19 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:19 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:19 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:19 np0005479823 ceph-mon[74913]: Deploying daemon haproxy.nfs.cephfs.compute-0.gptveb on compute-0
Oct 10 05:47:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:21 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:23 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:23 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:24 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:24 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:24 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:24 np0005479823 ceph-mon[74913]: Deploying daemon haproxy.nfs.cephfs.compute-2.eokdol on compute-2
Oct 10 05:47:24 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:47:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:25 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948000fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:25 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:26 np0005479823 podman[85197]: 2025-10-10 09:47:26.249075285 +0000 UTC m=+2.497893669 container create 30dfa2818bcae8337a19ee41f1e8a3aae98a11c352f0c3257cc7ee75aeaa249d (image=quay.io/ceph/haproxy:2.3, name=jolly_babbage)
Oct 10 05:47:26 np0005479823 podman[85197]: 2025-10-10 09:47:26.230611922 +0000 UTC m=+2.479430326 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Oct 10 05:47:26 np0005479823 systemd[1]: Started libpod-conmon-30dfa2818bcae8337a19ee41f1e8a3aae98a11c352f0c3257cc7ee75aeaa249d.scope.
Oct 10 05:47:26 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:47:26 np0005479823 podman[85197]: 2025-10-10 09:47:26.327110783 +0000 UTC m=+2.575929197 container init 30dfa2818bcae8337a19ee41f1e8a3aae98a11c352f0c3257cc7ee75aeaa249d (image=quay.io/ceph/haproxy:2.3, name=jolly_babbage)
Oct 10 05:47:26 np0005479823 podman[85197]: 2025-10-10 09:47:26.332090378 +0000 UTC m=+2.580908762 container start 30dfa2818bcae8337a19ee41f1e8a3aae98a11c352f0c3257cc7ee75aeaa249d (image=quay.io/ceph/haproxy:2.3, name=jolly_babbage)
Oct 10 05:47:26 np0005479823 podman[85197]: 2025-10-10 09:47:26.335375057 +0000 UTC m=+2.584193441 container attach 30dfa2818bcae8337a19ee41f1e8a3aae98a11c352f0c3257cc7ee75aeaa249d (image=quay.io/ceph/haproxy:2.3, name=jolly_babbage)
Oct 10 05:47:26 np0005479823 jolly_babbage[85315]: 0 0
Oct 10 05:47:26 np0005479823 podman[85197]: 2025-10-10 09:47:26.338070796 +0000 UTC m=+2.586889190 container died 30dfa2818bcae8337a19ee41f1e8a3aae98a11c352f0c3257cc7ee75aeaa249d (image=quay.io/ceph/haproxy:2.3, name=jolly_babbage)
Oct 10 05:47:26 np0005479823 systemd[1]: libpod-30dfa2818bcae8337a19ee41f1e8a3aae98a11c352f0c3257cc7ee75aeaa249d.scope: Deactivated successfully.
Oct 10 05:47:26 np0005479823 systemd[1]: var-lib-containers-storage-overlay-efd1d3a6dde83fb0e26726be44a6f416d3dc461412ded846f2b3f411afda69e4-merged.mount: Deactivated successfully.
Oct 10 05:47:26 np0005479823 podman[85197]: 2025-10-10 09:47:26.379170019 +0000 UTC m=+2.627988403 container remove 30dfa2818bcae8337a19ee41f1e8a3aae98a11c352f0c3257cc7ee75aeaa249d (image=quay.io/ceph/haproxy:2.3, name=jolly_babbage)
Oct 10 05:47:26 np0005479823 systemd[1]: libpod-conmon-30dfa2818bcae8337a19ee41f1e8a3aae98a11c352f0c3257cc7ee75aeaa249d.scope: Deactivated successfully.
Oct 10 05:47:26 np0005479823 systemd[1]: Reloading.
Oct 10 05:47:26 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:47:26 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:47:26 np0005479823 systemd[1]: Reloading.
Oct 10 05:47:26 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:47:26 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:47:27 np0005479823 systemd[1]: Starting Ceph haproxy.nfs.cephfs.compute-2.eokdol for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 05:47:27 np0005479823 podman[85459]: 2025-10-10 09:47:27.33932642 +0000 UTC m=+0.041931721 container create 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 05:47:27 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23862060fcbe060db6cccf6bb2d67f0a22678bd92e5ebb08f9d9555dbfd63796/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Oct 10 05:47:27 np0005479823 podman[85459]: 2025-10-10 09:47:27.411207015 +0000 UTC m=+0.113812416 container init 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 05:47:27 np0005479823 podman[85459]: 2025-10-10 09:47:27.323778945 +0000 UTC m=+0.026384266 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Oct 10 05:47:27 np0005479823 podman[85459]: 2025-10-10 09:47:27.42616994 +0000 UTC m=+0.128775281 container start 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 05:47:27 np0005479823 bash[85459]: 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0
Oct 10 05:47:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [NOTICE] 282/094727 (2) : New worker #1 (4) forked
Oct 10 05:47:27 np0005479823 systemd[1]: Started Ceph haproxy.nfs.cephfs.compute-2.eokdol for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 05:47:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:27 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89440016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:27 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:28 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:28.417863) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089648418082, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 6519, "num_deletes": 256, "total_data_size": 17899569, "memory_usage": 19298368, "flush_reason": "Manual Compaction"}
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089648465917, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 11386863, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 6524, "table_properties": {"data_size": 11362444, "index_size": 15281, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8069, "raw_key_size": 78427, "raw_average_key_size": 24, "raw_value_size": 11300778, "raw_average_value_size": 3506, "num_data_blocks": 678, "num_entries": 3223, "num_filter_entries": 3223, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 1760089519, "file_creation_time": 1760089648, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 48147 microseconds, and 22076 cpu microseconds.
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:28.466004) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 11386863 bytes OK
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:28.466050) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:28.469497) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:28.469520) EVENT_LOG_v1 {"time_micros": 1760089648469513, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:28.469548) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 17864492, prev total WAL file size 17864492, number of live WAL files 2.
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:28.476038) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323532' seq:0, type:0; will stop at (end)
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(10MB) 8(1648B)]
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089648476202, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 11388511, "oldest_snapshot_seqno": -1}
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: Deploying daemon keepalived.nfs.cephfs.compute-2.fcbgvm on compute-2
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 2970 keys, 11383085 bytes, temperature: kUnknown
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089648562681, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 11383085, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11359283, "index_size": 15245, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 7493, "raw_key_size": 74964, "raw_average_key_size": 25, "raw_value_size": 11300836, "raw_average_value_size": 3804, "num_data_blocks": 676, "num_entries": 2970, "num_filter_entries": 2970, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760089648, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:28.563169) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 11383085 bytes
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:28.564734) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 131.4 rd, 131.4 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(10.9, 0.0 +0.0 blob) out(10.9 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3228, records dropped: 258 output_compression: NoCompression
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:28.564784) EVENT_LOG_v1 {"time_micros": 1760089648564752, "job": 4, "event": "compaction_finished", "compaction_time_micros": 86659, "compaction_time_cpu_micros": 44766, "output_level": 6, "num_output_files": 1, "total_output_size": 11383085, "num_input_records": 3228, "num_output_records": 2970, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089648569218, "job": 4, "event": "table_file_deletion", "file_number": 14}
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089648569326, "job": 4, "event": "table_file_deletion", "file_number": 8}
Oct 10 05:47:28 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:28.475810) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 05:47:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:29 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:29 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:47:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:29 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89440016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:30 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:31 np0005479823 podman[85580]: 2025-10-10 09:47:31.24875015 +0000 UTC m=+3.138721491 container create cae100c2e8df9845474a421adcd28c177a74790dadf1bcd7f39f0e0b0cca9eaf (image=quay.io/ceph/keepalived:2.2.4, name=sad_yalow, com.redhat.component=keepalived-container, io.openshift.expose-services=, architecture=x86_64, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, distribution-scope=public, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, version=2.2.4, name=keepalived, release=1793, summary=Provides keepalived on RHEL 9 for Ceph.)
Oct 10 05:47:31 np0005479823 podman[85580]: 2025-10-10 09:47:31.234080223 +0000 UTC m=+3.124051564 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Oct 10 05:47:31 np0005479823 systemd[1]: Started libpod-conmon-cae100c2e8df9845474a421adcd28c177a74790dadf1bcd7f39f0e0b0cca9eaf.scope.
Oct 10 05:47:31 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:47:31 np0005479823 podman[85580]: 2025-10-10 09:47:31.309628379 +0000 UTC m=+3.199599750 container init cae100c2e8df9845474a421adcd28c177a74790dadf1bcd7f39f0e0b0cca9eaf (image=quay.io/ceph/keepalived:2.2.4, name=sad_yalow, io.buildah.version=1.28.2, architecture=x86_64, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, distribution-scope=public, build-date=2023-02-22T09:23:20, version=2.2.4, vcs-type=git, com.redhat.component=keepalived-container, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc.)
Oct 10 05:47:31 np0005479823 podman[85580]: 2025-10-10 09:47:31.317586572 +0000 UTC m=+3.207557913 container start cae100c2e8df9845474a421adcd28c177a74790dadf1bcd7f39f0e0b0cca9eaf (image=quay.io/ceph/keepalived:2.2.4, name=sad_yalow, release=1793, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, io.openshift.expose-services=, architecture=x86_64, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, vcs-type=git, com.redhat.component=keepalived-container)
Oct 10 05:47:31 np0005479823 podman[85580]: 2025-10-10 09:47:31.320666725 +0000 UTC m=+3.210638116 container attach cae100c2e8df9845474a421adcd28c177a74790dadf1bcd7f39f0e0b0cca9eaf (image=quay.io/ceph/keepalived:2.2.4, name=sad_yalow, vcs-type=git, io.openshift.expose-services=, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, release=1793, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., name=keepalived, description=keepalived for Ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph.)
Oct 10 05:47:31 np0005479823 sad_yalow[85679]: 0 0
Oct 10 05:47:31 np0005479823 systemd[1]: libpod-cae100c2e8df9845474a421adcd28c177a74790dadf1bcd7f39f0e0b0cca9eaf.scope: Deactivated successfully.
Oct 10 05:47:31 np0005479823 conmon[85679]: conmon cae100c2e8df9845474a <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-cae100c2e8df9845474a421adcd28c177a74790dadf1bcd7f39f0e0b0cca9eaf.scope/container/memory.events
Oct 10 05:47:31 np0005479823 podman[85580]: 2025-10-10 09:47:31.325737403 +0000 UTC m=+3.215708744 container died cae100c2e8df9845474a421adcd28c177a74790dadf1bcd7f39f0e0b0cca9eaf (image=quay.io/ceph/keepalived:2.2.4, name=sad_yalow, release=1793, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, io.buildah.version=1.28.2, architecture=x86_64, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, distribution-scope=public, io.openshift.expose-services=, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, vcs-type=git, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc.)
Oct 10 05:47:31 np0005479823 systemd[1]: var-lib-containers-storage-overlay-5670ca1ecf11d55c7732da222050b203d0f5568cb8345c4225ef46ef2a78289c-merged.mount: Deactivated successfully.
Oct 10 05:47:31 np0005479823 podman[85580]: 2025-10-10 09:47:31.365690558 +0000 UTC m=+3.255661899 container remove cae100c2e8df9845474a421adcd28c177a74790dadf1bcd7f39f0e0b0cca9eaf (image=quay.io/ceph/keepalived:2.2.4, name=sad_yalow, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., distribution-scope=public, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, architecture=x86_64, com.redhat.component=keepalived-container, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, version=2.2.4)
Oct 10 05:47:31 np0005479823 systemd[1]: libpod-conmon-cae100c2e8df9845474a421adcd28c177a74790dadf1bcd7f39f0e0b0cca9eaf.scope: Deactivated successfully.
Oct 10 05:47:31 np0005479823 systemd[1]: Reloading.
Oct 10 05:47:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:31 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:31 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:47:31 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:47:31 np0005479823 systemd[1]: Reloading.
Oct 10 05:47:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:31 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:31 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:47:31 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:47:32 np0005479823 systemd[1]: Starting Ceph keepalived.nfs.cephfs.compute-2.fcbgvm for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 05:47:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:32 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89440016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:32 np0005479823 podman[85824]: 2025-10-10 09:47:32.370524911 +0000 UTC m=+0.043128430 container create 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, vcs-type=git, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., architecture=x86_64, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, io.openshift.expose-services=)
Oct 10 05:47:32 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09511f2789405164399a7808eeeb44e6de8e40c54e9fca8640fe041738223d0f/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 05:47:32 np0005479823 podman[85824]: 2025-10-10 09:47:32.43714988 +0000 UTC m=+0.109753429 container init 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, version=2.2.4, architecture=x86_64, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, io.buildah.version=1.28.2, release=1793, summary=Provides keepalived on RHEL 9 for Ceph.)
Oct 10 05:47:32 np0005479823 podman[85824]: 2025-10-10 09:47:32.442361734 +0000 UTC m=+0.114965273 container start 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, release=1793, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, architecture=x86_64, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 10 05:47:32 np0005479823 podman[85824]: 2025-10-10 09:47:32.354183749 +0000 UTC m=+0.026787288 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Oct 10 05:47:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:32 2025: Starting Keepalived v2.2.4 (08/21,2021)
Oct 10 05:47:32 np0005479823 bash[85824]: 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c
Oct 10 05:47:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:32 2025: Running on Linux 5.14.0-621.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Tue Sep 30 07:37:35 UTC 2025 (built for Linux 5.14.0)
Oct 10 05:47:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:32 2025: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Oct 10 05:47:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:32 2025: Configuration file /etc/keepalived/keepalived.conf
Oct 10 05:47:32 np0005479823 systemd[1]: Started Ceph keepalived.nfs.cephfs.compute-2.fcbgvm for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 05:47:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:32 2025: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Oct 10 05:47:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:32 2025: Starting VRRP child process, pid=4
Oct 10 05:47:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:32 2025: Startup complete
Oct 10 05:47:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:32 2025: (VI_0) Entering BACKUP STATE (init)
Oct 10 05:47:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:32 2025: VRRP_Script(check_backend) succeeded
Oct 10 05:47:32 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:32 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:32 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:33 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:33 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e52 e52: 3 total, 3 up, 3 in
Oct 10 05:47:33 np0005479823 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Oct 10 05:47:33 np0005479823 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Oct 10 05:47:33 np0005479823 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Oct 10 05:47:33 np0005479823 ceph-mon[74913]: Deploying daemon keepalived.nfs.cephfs.compute-1.twbftp on compute-1
Oct 10 05:47:33 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Oct 10 05:47:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:33 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:34 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:34 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e53 e53: 3 total, 3 up, 3 in
Oct 10 05:47:34 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Oct 10 05:47:34 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]: dispatch
Oct 10 05:47:34 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:47:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:35 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e54 e54: 3 total, 3 up, 3 in
Oct 10 05:47:35 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]': finished
Oct 10 05:47:35 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Oct 10 05:47:35 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct 10 05:47:35 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct 10 05:47:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:35 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:36 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:36 2025: (VI_0) Entering MASTER STATE
Oct 10 05:47:36 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e55 e55: 3 total, 3 up, 3 in
Oct 10 05:47:36 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Oct 10 05:47:36 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]': finished
Oct 10 05:47:36 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Oct 10 05:47:36 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Oct 10 05:47:36 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Oct 10 05:47:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:37 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:37 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e56 e56: 3 total, 3 up, 3 in
Oct 10 05:47:37 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Oct 10 05:47:37 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:37 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:37 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:37 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct 10 05:47:37 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct 10 05:47:37 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Oct 10 05:47:37 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Oct 10 05:47:37 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Oct 10 05:47:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:37 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:38 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:38 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e57 e57: 3 total, 3 up, 3 in
Oct 10 05:47:38 np0005479823 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Oct 10 05:47:38 np0005479823 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Oct 10 05:47:38 np0005479823 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Oct 10 05:47:38 np0005479823 ceph-mon[74913]: Deploying daemon keepalived.nfs.cephfs.compute-0.mciijj on compute-0
Oct 10 05:47:38 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Oct 10 05:47:38 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Oct 10 05:47:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:39 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:39 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e58 e58: 3 total, 3 up, 3 in
Oct 10 05:47:39 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct 10 05:47:39 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Oct 10 05:47:39 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e58 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:47:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:39 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:40 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e59 e59: 3 total, 3 up, 3 in
Oct 10 05:47:40 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Oct 10 05:47:40 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Oct 10 05:47:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:40 2025: (VI_0) Received advert from 192.168.122.101 with lower priority 90, ours 90, forcing new election
Oct 10 05:47:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:41 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c002b10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:41 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e60 e60: 3 total, 3 up, 3 in
Oct 10 05:47:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:41 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:42 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:43 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:43 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:43 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:43 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:43 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:43 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:44 np0005479823 ceph-mon[74913]: Deploying daemon alertmanager.compute-0 on compute-0
Oct 10 05:47:44 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:44 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:44 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:47:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:45 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:45 2025: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Oct 10 05:47:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:47:45 2025: (VI_0) Entering BACKUP STATE
Oct 10 05:47:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:45 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:46 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e61 e61: 3 total, 3 up, 3 in
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[9.16( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[9.7( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[11.17( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.6( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.15( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[11.16( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.16( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[9.17( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[11.13( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.11( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.2( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[9.3( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.3( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[9.9( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.9( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[9.b( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[9.8( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.a( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[11.e( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.d( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[11.a( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.c( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[11.8( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.b( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[11.3( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[9.5( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:46 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c003c10 fd 37 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.5( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.f( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[9.18( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.1f( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[11.19( empty local-lis/les=0/0 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[9.1d( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[8.1c( empty local-lis/les=0/0 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[9.13( empty local-lis/les=0/0 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.17( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[12.11( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[12.13( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.15( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.13( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct 10 05:47:46 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct 10 05:47:46 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct 10 05:47:46 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Oct 10 05:47:46 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct 10 05:47:46 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:46 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:46 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:46 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:46 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:46 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:46 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.1( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[12.7( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[12.4( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[12.9( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.d( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[7.5( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.b( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.9( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.3( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[12.2( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.5( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[12.3( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[7.14( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.19( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[12.1a( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[7.11( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.1d( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[12.18( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.1f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[7.1d( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[7.a( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.7( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[12.1e( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[12.1d( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.1b( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[12.17( empty local-lis/les=0/0 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[10.11( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[7.16( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 61 pg[7.1f( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:47 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e62 e62: 3 total, 3 up, 3 in
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.1( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.1( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.17( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.17( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.1f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.1f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.13( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.15( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.13( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.15( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.19( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.1b( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.1b( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.9( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.9( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.19( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.7( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.7( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.b( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.b( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.1d( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.d( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.1d( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.d( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.11( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.11( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.3( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.5( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.3( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[10.5( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=62) [2]/[1] r=-1 lpr=62 pi=[56,62)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[12.13( empty local-lis/les=61/62 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[12.9( v 60'1 lc 0'0 (0'0,60'1] local-lis/les=61/62 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=60'1 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[12.7( empty local-lis/les=61/62 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[12.4( empty local-lis/les=61/62 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.c( v 51'44 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[12.11( empty local-lis/les=61/62 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[7.14( empty local-lis/les=61/62 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[7.16( empty local-lis/les=61/62 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[12.1d( empty local-lis/les=61/62 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[9.18( v 44'6 (0'0,44'6] local-lis/les=61/62 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=44'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[9.5( v 44'6 (0'0,44'6] local-lis/les=61/62 n=1 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=44'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[11.19( v 48'48 (0'0,48'48] local-lis/les=61/62 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[7.a( empty local-lis/les=61/62 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[9.9( v 44'6 (0'0,44'6] local-lis/les=61/62 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=44'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.5( v 51'44 (0'0,51'44] local-lis/les=61/62 n=1 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[12.2( empty local-lis/les=61/62 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[7.5( empty local-lis/les=61/62 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[9.7( v 44'6 (0'0,44'6] local-lis/les=61/62 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=44'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.6( v 51'44 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.b( v 51'44 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[11.e( v 60'57 lc 60'56 (0'0,60'57] local-lis/les=61/62 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=60'57 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[11.3( v 60'57 lc 60'56 (0'0,60'57] local-lis/les=61/62 n=1 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=60'57 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[12.1e( empty local-lis/les=61/62 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[11.8( v 48'48 (0'0,48'48] local-lis/les=61/62 n=1 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.d( v 51'44 lc 51'19 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.15( v 51'44 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.a( v 51'44 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[9.16( v 44'6 (0'0,44'6] local-lis/les=61/62 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=44'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[9.b( v 44'6 (0'0,44'6] local-lis/les=61/62 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=44'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[7.1f( empty local-lis/les=61/62 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.3( v 51'44 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[11.13( v 48'48 (0'0,48'48] local-lis/les=61/62 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.11( v 51'44 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[11.16( v 48'48 (0'0,48'48] local-lis/les=61/62 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[11.a( v 48'48 (0'0,48'48] local-lis/les=61/62 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.2( v 51'44 (0'0,51'44] local-lis/les=61/62 n=1 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[9.3( v 44'6 (0'0,44'6] local-lis/les=61/62 n=1 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=44'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.9( v 51'44 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[9.8( v 44'6 (0'0,44'6] local-lis/les=61/62 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=44'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[9.17( v 44'6 (0'0,44'6] local-lis/les=61/62 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=44'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.16( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.1f( v 51'44 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.1c( v 51'44 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[12.1a( empty local-lis/les=61/62 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[9.1d( v 44'6 (0'0,44'6] local-lis/les=61/62 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=44'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[12.18( v 60'1 lc 0'0 (0'0,60'1] local-lis/les=61/62 n=1 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=60'1 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[12.17( empty local-lis/les=61/62 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[7.11( empty local-lis/les=61/62 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[8.f( v 51'44 (0'0,51'44] local-lis/les=61/62 n=0 ec=54/40 lis/c=54/54 les/c/f=55/55/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[7.1d( empty local-lis/les=61/62 n=0 ec=54/22 lis/c=54/54 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[54,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[11.17( v 48'48 (0'0,48'48] local-lis/les=61/62 n=0 ec=58/47 lis/c=58/58 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[12.3( empty local-lis/les=61/62 n=0 ec=58/49 lis/c=58/58 les/c/f=59/59/0 sis=61) [2] r=0 lpr=61 pi=[58,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 62 pg[9.13( v 44'6 (0'0,44'6] local-lis/les=61/62 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=61) [2] r=0 lpr=61 pi=[56,61)/1 crt=44'6 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:47 np0005479823 ceph-mon[74913]: Regenerating cephadm self-signed grafana TLS certificates
Oct 10 05:47:47 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:47 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]': finished
Oct 10 05:47:47 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Oct 10 05:47:47 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Oct 10 05:47:47 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Oct 10 05:47:47 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Oct 10 05:47:47 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Oct 10 05:47:47 np0005479823 ceph-mon[74913]: Deploying daemon grafana.compute-0 on compute-0
Oct 10 05:47:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:47 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948002f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:47 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.13 deep-scrub starts
Oct 10 05:47:47 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.13 deep-scrub ok
Oct 10 05:47:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:48 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:48 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e63 e63: 3 total, 3 up, 3 in
Oct 10 05:47:48 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Oct 10 05:47:48 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Oct 10 05:47:48 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e64 e64: 3 total, 3 up, 3 in
Oct 10 05:47:48 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 64 pg[10.17( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=64) [2] r=0 lpr=64 pi=[56,64)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:48 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 64 pg[10.17( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=64) [2] r=0 lpr=64 pi=[56,64)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:48 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 64 pg[10.1( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=64) [2] r=0 lpr=64 pi=[56,64)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:48 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 64 pg[10.1( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=64) [2] r=0 lpr=64 pi=[56,64)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:49 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:49 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e65 e65: 3 total, 3 up, 3 in
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.19( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.19( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.7( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.7( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.1b( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.1b( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.9( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.9( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.15( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.15( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.13( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.b( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.13( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.b( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.d( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.1d( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.d( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.1d( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.11( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.11( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.5( v 64'1098 (0'0,64'1098] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=59'1094 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.5( v 64'1098 (0'0,64'1098] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=59'1094 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.3( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.3( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:49 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.17( v 51'1091 (0'0,51'1091] local-lis/les=64/65 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=64) [2] r=0 lpr=64 pi=[56,64)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 65 pg[10.1( v 51'1091 (0'0,51'1091] local-lis/les=64/65 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=64) [2] r=0 lpr=64 pi=[56,64)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:49 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e65 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:47:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:49 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.17 scrub starts
Oct 10 05:47:49 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.17 scrub ok
Oct 10 05:47:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:50 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e66 e66: 3 total, 3 up, 3 in
Oct 10 05:47:50 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:50 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.7( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:50 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.19( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:50 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.1b( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:50 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.9( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:50 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:50 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.13( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:50 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.b( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:50 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.d( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:50 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.1d( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:50 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.11( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:50 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.3( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:50 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.5( v 64'1098 (0'0,64'1098] local-lis/les=65/66 n=6 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=64'1098 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:50 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 66 pg[10.15( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=62/56 les/c/f=63/57/0 sis=65) [2] r=0 lpr=65 pi=[56,65)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:47:50 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.1 deep-scrub starts
Oct 10 05:47:50 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.1 deep-scrub ok
Oct 10 05:47:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:51 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:51 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e67 e67: 3 total, 3 up, 3 in
Oct 10 05:47:51 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.1f scrub starts
Oct 10 05:47:51 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.1f scrub ok
Oct 10 05:47:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:51 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:52 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:52 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.7 scrub starts
Oct 10 05:47:52 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.7 scrub ok
Oct 10 05:47:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:53 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:53 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.1b scrub starts
Oct 10 05:47:53 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.1b scrub ok
Oct 10 05:47:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:53 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8968000df0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:54 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8938000d00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:54 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.7 deep-scrub starts
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:54 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.7 deep-scrub ok
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e67 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:54.931797) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089674931865, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 1078, "num_deletes": 251, "total_data_size": 1850155, "memory_usage": 1888160, "flush_reason": "Manual Compaction"}
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089674945966, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 1187808, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6529, "largest_seqno": 7602, "table_properties": {"data_size": 1182537, "index_size": 2603, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 13502, "raw_average_key_size": 21, "raw_value_size": 1171020, "raw_average_value_size": 1847, "num_data_blocks": 115, "num_entries": 634, "num_filter_entries": 634, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089648, "oldest_key_time": 1760089648, "file_creation_time": 1760089674, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 14225 microseconds, and 8607 cpu microseconds.
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:54.946017) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 1187808 bytes OK
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:54.946038) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:54.947184) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:54.947201) EVENT_LOG_v1 {"time_micros": 1760089674947195, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:54.947221) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 1844398, prev total WAL file size 1844398, number of live WAL files 2.
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:54.947761) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(1159KB)], [15(10MB)]
Oct 10 05:47:54 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089674947792, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 12570893, "oldest_snapshot_seqno": -1}
Oct 10 05:47:55 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3077 keys, 11334814 bytes, temperature: kUnknown
Oct 10 05:47:55 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089675013149, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 11334814, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11310343, "index_size": 15658, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 7749, "raw_key_size": 79218, "raw_average_key_size": 25, "raw_value_size": 11249777, "raw_average_value_size": 3656, "num_data_blocks": 685, "num_entries": 3077, "num_filter_entries": 3077, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760089674, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Oct 10 05:47:55 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 05:47:55 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:55.013397) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 11334814 bytes
Oct 10 05:47:55 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:55.014901) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 192.1 rd, 173.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 10.9 +0.0 blob) out(10.8 +0.0 blob), read-write-amplify(20.1) write-amplify(9.5) OK, records in: 3604, records dropped: 527 output_compression: NoCompression
Oct 10 05:47:55 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:55.014924) EVENT_LOG_v1 {"time_micros": 1760089675014914, "job": 6, "event": "compaction_finished", "compaction_time_micros": 65433, "compaction_time_cpu_micros": 21720, "output_level": 6, "num_output_files": 1, "total_output_size": 11334814, "num_input_records": 3604, "num_output_records": 3077, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 05:47:55 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 05:47:55 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089675015386, "job": 6, "event": "table_file_deletion", "file_number": 17}
Oct 10 05:47:55 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 05:47:55 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089675017929, "job": 6, "event": "table_file_deletion", "file_number": 15}
Oct 10 05:47:55 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:54.947695) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 05:47:55 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:55.017977) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 05:47:55 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:55.017981) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 05:47:55 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:55.017982) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 05:47:55 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:55.017984) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 05:47:55 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:47:55.017985) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 05:47:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:55 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:55 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.4 scrub starts
Oct 10 05:47:55 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.4 scrub ok
Oct 10 05:47:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:55 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:56 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8968000df0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:56 np0005479823 ceph-mon[74913]: Deploying daemon haproxy.rgw.default.compute-0.ofnenu on compute-0
Oct 10 05:47:56 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Oct 10 05:47:56 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e68 e68: 3 total, 3 up, 3 in
Oct 10 05:47:56 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.c scrub starts
Oct 10 05:47:56 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.c scrub ok
Oct 10 05:47:57 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Oct 10 05:47:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:57 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8938001820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:57 np0005479823 podman[85957]: 2025-10-10 09:47:57.824801574 +0000 UTC m=+0.042896108 container create ed19d1f64bd1adade0c272adce29e9e164fcc1e3134fde344b0f5561319c7755 (image=quay.io/ceph/haproxy:2.3, name=gracious_wiles)
Oct 10 05:47:57 np0005479823 systemd[1]: Started libpod-conmon-ed19d1f64bd1adade0c272adce29e9e164fcc1e3134fde344b0f5561319c7755.scope.
Oct 10 05:47:57 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:47:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:57 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:57 np0005479823 podman[85957]: 2025-10-10 09:47:57.807659868 +0000 UTC m=+0.025754402 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Oct 10 05:47:57 np0005479823 podman[85957]: 2025-10-10 09:47:57.906228078 +0000 UTC m=+0.124322612 container init ed19d1f64bd1adade0c272adce29e9e164fcc1e3134fde344b0f5561319c7755 (image=quay.io/ceph/haproxy:2.3, name=gracious_wiles)
Oct 10 05:47:57 np0005479823 podman[85957]: 2025-10-10 09:47:57.912541849 +0000 UTC m=+0.130636363 container start ed19d1f64bd1adade0c272adce29e9e164fcc1e3134fde344b0f5561319c7755 (image=quay.io/ceph/haproxy:2.3, name=gracious_wiles)
Oct 10 05:47:57 np0005479823 podman[85957]: 2025-10-10 09:47:57.915412991 +0000 UTC m=+0.133507525 container attach ed19d1f64bd1adade0c272adce29e9e164fcc1e3134fde344b0f5561319c7755 (image=quay.io/ceph/haproxy:2.3, name=gracious_wiles)
Oct 10 05:47:57 np0005479823 gracious_wiles[85974]: 0 0
Oct 10 05:47:57 np0005479823 systemd[1]: libpod-ed19d1f64bd1adade0c272adce29e9e164fcc1e3134fde344b0f5561319c7755.scope: Deactivated successfully.
Oct 10 05:47:57 np0005479823 podman[85957]: 2025-10-10 09:47:57.91820473 +0000 UTC m=+0.136299254 container died ed19d1f64bd1adade0c272adce29e9e164fcc1e3134fde344b0f5561319c7755 (image=quay.io/ceph/haproxy:2.3, name=gracious_wiles)
Oct 10 05:47:57 np0005479823 systemd[1]: var-lib-containers-storage-overlay-5552605a6ad55e4d529a53d7f2fcdf2f58cfb82d594f5d0047a2b5e5948df48e-merged.mount: Deactivated successfully.
Oct 10 05:47:57 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.11 scrub starts
Oct 10 05:47:57 np0005479823 podman[85957]: 2025-10-10 09:47:57.95087311 +0000 UTC m=+0.168967624 container remove ed19d1f64bd1adade0c272adce29e9e164fcc1e3134fde344b0f5561319c7755 (image=quay.io/ceph/haproxy:2.3, name=gracious_wiles)
Oct 10 05:47:57 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.11 scrub ok
Oct 10 05:47:57 np0005479823 systemd[1]: libpod-conmon-ed19d1f64bd1adade0c272adce29e9e164fcc1e3134fde344b0f5561319c7755.scope: Deactivated successfully.
Oct 10 05:47:58 np0005479823 systemd[1]: Reloading.
Oct 10 05:47:58 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:47:58 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:47:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:58 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:58 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:58 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:58 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:58 np0005479823 ceph-mon[74913]: Deploying daemon haproxy.rgw.default.compute-2.mhdkdo on compute-2
Oct 10 05:47:58 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Oct 10 05:47:58 np0005479823 systemd[1]: Reloading.
Oct 10 05:47:58 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e69 e69: 3 total, 3 up, 3 in
Oct 10 05:47:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 69 pg[10.14( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=69) [2] r=0 lpr=69 pi=[56,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 69 pg[10.c( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=69) [2] r=0 lpr=69 pi=[56,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 69 pg[10.4( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=69) [2] r=0 lpr=69 pi=[56,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 69 pg[10.1c( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=69) [2] r=0 lpr=69 pi=[56,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:47:58 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:47:58 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:47:58 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e70 e70: 3 total, 3 up, 3 in
Oct 10 05:47:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 70 pg[10.4( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=70) [2]/[1] r=-1 lpr=70 pi=[56,70)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 70 pg[10.4( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=70) [2]/[1] r=-1 lpr=70 pi=[56,70)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 05:47:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 70 pg[10.c( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=70) [2]/[1] r=-1 lpr=70 pi=[56,70)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 70 pg[10.c( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=70) [2]/[1] r=-1 lpr=70 pi=[56,70)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 05:47:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 70 pg[10.14( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=70) [2]/[1] r=-1 lpr=70 pi=[56,70)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 70 pg[10.14( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=70) [2]/[1] r=-1 lpr=70 pi=[56,70)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 05:47:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 70 pg[10.1c( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=70) [2]/[1] r=-1 lpr=70 pi=[56,70)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:47:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 70 pg[10.1c( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=70) [2]/[1] r=-1 lpr=70 pi=[56,70)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 05:47:58 np0005479823 systemd[1]: Starting Ceph haproxy.rgw.default.compute-2.mhdkdo for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 05:47:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:47:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000065s ======
Oct 10 05:47:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:47:58.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Oct 10 05:47:58 np0005479823 podman[86118]: 2025-10-10 09:47:58.813511421 +0000 UTC m=+0.042283238 container create 368ea6225e5db4fb8f3c793229a798884b40b020716bc0a76ab32ec2cbc8121d (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-rgw-default-compute-2-mhdkdo)
Oct 10 05:47:58 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7b9a57660298361ca69f9d96249345bb33b06e12a121d055002216180d6c03a/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Oct 10 05:47:58 np0005479823 podman[86118]: 2025-10-10 09:47:58.877776148 +0000 UTC m=+0.106547975 container init 368ea6225e5db4fb8f3c793229a798884b40b020716bc0a76ab32ec2cbc8121d (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-rgw-default-compute-2-mhdkdo)
Oct 10 05:47:58 np0005479823 podman[86118]: 2025-10-10 09:47:58.887060645 +0000 UTC m=+0.115832422 container start 368ea6225e5db4fb8f3c793229a798884b40b020716bc0a76ab32ec2cbc8121d (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-rgw-default-compute-2-mhdkdo)
Oct 10 05:47:58 np0005479823 bash[86118]: 368ea6225e5db4fb8f3c793229a798884b40b020716bc0a76ab32ec2cbc8121d
Oct 10 05:47:58 np0005479823 podman[86118]: 2025-10-10 09:47:58.795653822 +0000 UTC m=+0.024425619 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Oct 10 05:47:58 np0005479823 systemd[1]: Started Ceph haproxy.rgw.default.compute-2.mhdkdo for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 05:47:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-rgw-default-compute-2-mhdkdo[86133]: [NOTICE] 282/094758 (2) : New worker #1 (4) forked
Oct 10 05:47:58 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 7.14 scrub starts
Oct 10 05:47:58 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 7.14 scrub ok
Oct 10 05:47:59 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Oct 10 05:47:59 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:59 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:59 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:59 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:59 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:47:59 np0005479823 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Oct 10 05:47:59 np0005479823 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Oct 10 05:47:59 np0005479823 ceph-mon[74913]: Deploying daemon keepalived.rgw.default.compute-2.bbeizy on compute-2
Oct 10 05:47:59 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e71 e71: 3 total, 3 up, 3 in
Oct 10 05:47:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:59 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8968000df0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:59 np0005479823 podman[86238]: 2025-10-10 09:47:59.573145241 +0000 UTC m=+0.052414061 container create b565333fd7652d84803b430ee8c7f6acb24887f9659d6b3ff035cf6d9f289ed6 (image=quay.io/ceph/keepalived:2.2.4, name=thirsty_chatterjee, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, description=keepalived for Ceph, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, release=1793, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, vcs-type=git)
Oct 10 05:47:59 np0005479823 systemd[1]: Started libpod-conmon-b565333fd7652d84803b430ee8c7f6acb24887f9659d6b3ff035cf6d9f289ed6.scope.
Oct 10 05:47:59 np0005479823 podman[86238]: 2025-10-10 09:47:59.550110607 +0000 UTC m=+0.029379447 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Oct 10 05:47:59 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:47:59 np0005479823 podman[86238]: 2025-10-10 09:47:59.671769322 +0000 UTC m=+0.151038152 container init b565333fd7652d84803b430ee8c7f6acb24887f9659d6b3ff035cf6d9f289ed6 (image=quay.io/ceph/keepalived:2.2.4, name=thirsty_chatterjee, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, architecture=x86_64, com.redhat.component=keepalived-container, distribution-scope=public, description=keepalived for Ceph, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, vendor=Red Hat, Inc.)
Oct 10 05:47:59 np0005479823 podman[86238]: 2025-10-10 09:47:59.680865062 +0000 UTC m=+0.160133872 container start b565333fd7652d84803b430ee8c7f6acb24887f9659d6b3ff035cf6d9f289ed6 (image=quay.io/ceph/keepalived:2.2.4, name=thirsty_chatterjee, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., release=1793, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, io.buildah.version=1.28.2, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20)
Oct 10 05:47:59 np0005479823 podman[86238]: 2025-10-10 09:47:59.683654181 +0000 UTC m=+0.162922981 container attach b565333fd7652d84803b430ee8c7f6acb24887f9659d6b3ff035cf6d9f289ed6 (image=quay.io/ceph/keepalived:2.2.4, name=thirsty_chatterjee, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, build-date=2023-02-22T09:23:20, vcs-type=git, distribution-scope=public, architecture=x86_64, release=1793, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, vendor=Red Hat, Inc.)
Oct 10 05:47:59 np0005479823 thirsty_chatterjee[86255]: 0 0
Oct 10 05:47:59 np0005479823 systemd[1]: libpod-b565333fd7652d84803b430ee8c7f6acb24887f9659d6b3ff035cf6d9f289ed6.scope: Deactivated successfully.
Oct 10 05:47:59 np0005479823 conmon[86255]: conmon b565333fd7652d84803b <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b565333fd7652d84803b430ee8c7f6acb24887f9659d6b3ff035cf6d9f289ed6.scope/container/memory.events
Oct 10 05:47:59 np0005479823 podman[86238]: 2025-10-10 09:47:59.690639654 +0000 UTC m=+0.169908464 container died b565333fd7652d84803b430ee8c7f6acb24887f9659d6b3ff035cf6d9f289ed6 (image=quay.io/ceph/keepalived:2.2.4, name=thirsty_chatterjee, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, io.k8s.display-name=Keepalived on RHEL 9, description=keepalived for Ceph, vendor=Red Hat, Inc., architecture=x86_64, name=keepalived, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, vcs-type=git)
Oct 10 05:47:59 np0005479823 systemd[1]: var-lib-containers-storage-overlay-e1938b87bc92c8c02f70d75105dec5219aa891e8473639f341cb105abb51494f-merged.mount: Deactivated successfully.
Oct 10 05:47:59 np0005479823 podman[86238]: 2025-10-10 09:47:59.734452939 +0000 UTC m=+0.213721749 container remove b565333fd7652d84803b430ee8c7f6acb24887f9659d6b3ff035cf6d9f289ed6 (image=quay.io/ceph/keepalived:2.2.4, name=thirsty_chatterjee, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, release=1793, vcs-type=git, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph)
Oct 10 05:47:59 np0005479823 systemd[1]: libpod-conmon-b565333fd7652d84803b430ee8c7f6acb24887f9659d6b3ff035cf6d9f289ed6.scope: Deactivated successfully.
Oct 10 05:47:59 np0005479823 systemd[1]: Reloading.
Oct 10 05:47:59 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e71 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:47:59 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:47:59 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:47:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:47:59 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8938001820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:47:59 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.1d scrub starts
Oct 10 05:47:59 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.1d scrub ok
Oct 10 05:48:00 np0005479823 systemd[1]: Reloading.
Oct 10 05:48:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:00 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:00 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:48:00 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:48:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:48:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:00.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:48:00 np0005479823 systemd[1]: Starting Ceph keepalived.rgw.default.compute-2.bbeizy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 05:48:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e72 e72: 3 total, 3 up, 3 in
Oct 10 05:48:00 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Oct 10 05:48:00 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.4( v 71'1102 (0'0,71'1102] local-lis/les=0/0 n=6 ec=56/45 lis/c=70/56 les/c/f=71/57/0 sis=72) [2] r=0 lpr=72 pi=[56,72)/1 luod=0'0 crt=60'1098 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:00 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.4( v 71'1102 (0'0,71'1102] local-lis/les=0/0 n=6 ec=56/45 lis/c=70/56 les/c/f=71/57/0 sis=72) [2] r=0 lpr=72 pi=[56,72)/1 crt=60'1098 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:00 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.c( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=70/56 les/c/f=71/57/0 sis=72) [2] r=0 lpr=72 pi=[56,72)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:00 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.c( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=70/56 les/c/f=71/57/0 sis=72) [2] r=0 lpr=72 pi=[56,72)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:00 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.15( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=72 pruub=14.230051041s) [0] r=-1 lpr=72 pi=[65,72)/1 crt=51'1091 mlcod 0'0 active pruub 144.965850830s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:00 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.15( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=72 pruub=14.230030060s) [0] r=-1 lpr=72 pi=[65,72)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 144.965850830s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:00 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.14( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=70/56 les/c/f=71/57/0 sis=72) [2] r=0 lpr=72 pi=[56,72)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:00 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.14( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=70/56 les/c/f=71/57/0 sis=72) [2] r=0 lpr=72 pi=[56,72)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:00 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.1d( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=72 pruub=14.229329109s) [0] r=-1 lpr=72 pi=[65,72)/1 crt=51'1091 mlcod 0'0 active pruub 144.965774536s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:00 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.1d( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=72 pruub=14.229254723s) [0] r=-1 lpr=72 pi=[65,72)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 144.965774536s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:00 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.1c( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=70/56 les/c/f=71/57/0 sis=72) [2] r=0 lpr=72 pi=[56,72)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:00 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.1c( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=70/56 les/c/f=71/57/0 sis=72) [2] r=0 lpr=72 pi=[56,72)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:00 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.d( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=72 pruub=14.228994370s) [0] r=-1 lpr=72 pi=[65,72)/1 crt=51'1091 mlcod 0'0 active pruub 144.965759277s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:00 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.d( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=72 pruub=14.228788376s) [0] r=-1 lpr=72 pi=[65,72)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 144.965759277s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:00 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.5( v 67'1101 (0'0,67'1101] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=72 pruub=14.228815079s) [0] r=-1 lpr=72 pi=[65,72)/1 crt=66'1099 lcod 66'1100 mlcod 66'1100 active pruub 144.965835571s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:00 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 72 pg[10.5( v 67'1101 (0'0,67'1101] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=72 pruub=14.228641510s) [0] r=-1 lpr=72 pi=[65,72)/1 crt=66'1099 lcod 66'1100 mlcod 0'0 unknown NOTIFY pruub 144.965835571s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:00.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:00 np0005479823 podman[86404]: 2025-10-10 09:48:00.607283545 +0000 UTC m=+0.046825002 container create f8cfe2dfc37a24698160fece75542d6b585efaf1af2bb0ec7b8d11be8a7b8654 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, build-date=2023-02-22T09:23:20, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, name=keepalived, release=1793, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, version=2.2.4, io.openshift.expose-services=)
Oct 10 05:48:00 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00956075af261ac334734a594a59fe9de93d7e170a73eb744c05481b26897625/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 05:48:00 np0005479823 podman[86404]: 2025-10-10 09:48:00.590415228 +0000 UTC m=+0.029956705 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Oct 10 05:48:00 np0005479823 podman[86404]: 2025-10-10 09:48:00.687289834 +0000 UTC m=+0.126831341 container init f8cfe2dfc37a24698160fece75542d6b585efaf1af2bb0ec7b8d11be8a7b8654 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.28.2, build-date=2023-02-22T09:23:20, version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=keepalived-container, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, architecture=x86_64, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, name=keepalived)
Oct 10 05:48:00 np0005479823 podman[86404]: 2025-10-10 09:48:00.697179459 +0000 UTC m=+0.136720936 container start f8cfe2dfc37a24698160fece75542d6b585efaf1af2bb0ec7b8d11be8a7b8654 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, architecture=x86_64, description=keepalived for Ceph, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, release=1793)
Oct 10 05:48:00 np0005479823 bash[86404]: f8cfe2dfc37a24698160fece75542d6b585efaf1af2bb0ec7b8d11be8a7b8654
Oct 10 05:48:00 np0005479823 systemd[1]: Started Ceph keepalived.rgw.default.compute-2.bbeizy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 05:48:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:00 2025: Starting Keepalived v2.2.4 (08/21,2021)
Oct 10 05:48:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:00 2025: Running on Linux 5.14.0-621.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Tue Sep 30 07:37:35 UTC 2025 (built for Linux 5.14.0)
Oct 10 05:48:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:00 2025: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Oct 10 05:48:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:00 2025: Configuration file /etc/keepalived/keepalived.conf
Oct 10 05:48:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:00 2025: Failed to bind to process monitoring socket - errno 98 - Address already in use
Oct 10 05:48:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:00 2025: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Oct 10 05:48:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:00 2025: Starting VRRP child process, pid=4
Oct 10 05:48:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:00 2025: Startup complete
Oct 10 05:48:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:00 2025: (VI_0) Entering BACKUP STATE (init)
Oct 10 05:48:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:00 2025: VRRP_Script(check_backend) succeeded
Oct 10 05:48:01 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.5 scrub starts
Oct 10 05:48:01 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.5 scrub ok
Oct 10 05:48:01 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e73 e73: 3 total, 3 up, 3 in
Oct 10 05:48:01 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 73 pg[10.5( v 67'1101 (0'0,67'1101] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=73) [0]/[2] r=0 lpr=73 pi=[65,73)/1 crt=66'1099 lcod 66'1100 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:01 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 73 pg[10.1d( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=73) [0]/[2] r=0 lpr=73 pi=[65,73)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:01 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 73 pg[10.1d( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=73) [0]/[2] r=0 lpr=73 pi=[65,73)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:01 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 73 pg[10.5( v 67'1101 (0'0,67'1101] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=73) [0]/[2] r=0 lpr=73 pi=[65,73)/1 crt=66'1099 lcod 66'1100 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:01 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 73 pg[10.d( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=73) [0]/[2] r=0 lpr=73 pi=[65,73)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:01 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 73 pg[10.15( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=73) [0]/[2] r=0 lpr=73 pi=[65,73)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:01 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 73 pg[10.15( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=73) [0]/[2] r=0 lpr=73 pi=[65,73)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:01 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 73 pg[10.d( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=73) [0]/[2] r=0 lpr=73 pi=[65,73)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:01 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Oct 10 05:48:01 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:01 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:01 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:01 np0005479823 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Oct 10 05:48:01 np0005479823 ceph-mon[74913]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Oct 10 05:48:01 np0005479823 ceph-mon[74913]: Deploying daemon keepalived.rgw.default.compute-0.igkrok on compute-0
Oct 10 05:48:01 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 73 pg[10.14( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=5 ec=56/45 lis/c=70/56 les/c/f=71/57/0 sis=72) [2] r=0 lpr=72 pi=[56,72)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:48:01 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 73 pg[10.4( v 71'1102 (0'0,71'1102] local-lis/les=72/73 n=6 ec=56/45 lis/c=70/56 les/c/f=71/57/0 sis=72) [2] r=0 lpr=72 pi=[56,72)/1 crt=71'1102 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:48:01 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 73 pg[10.c( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=6 ec=56/45 lis/c=70/56 les/c/f=71/57/0 sis=72) [2] r=0 lpr=72 pi=[56,72)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:48:01 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 73 pg[10.1c( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=5 ec=56/45 lis/c=70/56 les/c/f=71/57/0 sis=72) [2] r=0 lpr=72 pi=[56,72)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:48:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:01 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:01 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:01 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8968000df0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:02 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.18 scrub starts
Oct 10 05:48:02 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.18 scrub ok
Oct 10 05:48:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:02 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8938001820 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:02.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:02 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e74 e74: 3 total, 3 up, 3 in
Oct 10 05:48:02 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 74 pg[10.5( v 67'1101 (0'0,67'1101] local-lis/les=73/74 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=73) [0]/[2] async=[0] r=0 lpr=73 pi=[65,73)/1 crt=67'1101 lcod 66'1100 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:48:02 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 74 pg[10.1d( v 51'1091 (0'0,51'1091] local-lis/les=73/74 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=73) [0]/[2] async=[0] r=0 lpr=73 pi=[65,73)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:48:02 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 74 pg[10.15( v 51'1091 (0'0,51'1091] local-lis/les=73/74 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=73) [0]/[2] async=[0] r=0 lpr=73 pi=[65,73)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:48:02 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 74 pg[10.d( v 51'1091 (0'0,51'1091] local-lis/les=73/74 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=73) [0]/[2] async=[0] r=0 lpr=73 pi=[65,73)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:48:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:02.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:02 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:02 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:03 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.19 scrub starts
Oct 10 05:48:03 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.19 scrub ok
Oct 10 05:48:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:03 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:03 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e75 e75: 3 total, 3 up, 3 in
Oct 10 05:48:03 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 75 pg[10.15( v 51'1091 (0'0,51'1091] local-lis/les=73/74 n=5 ec=56/45 lis/c=73/65 les/c/f=74/66/0 sis=75 pruub=15.105600357s) [0] async=[0] r=-1 lpr=75 pi=[65,75)/1 crt=51'1091 mlcod 51'1091 active pruub 148.798385620s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:03 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 75 pg[10.15( v 51'1091 (0'0,51'1091] local-lis/les=73/74 n=5 ec=56/45 lis/c=73/65 les/c/f=74/66/0 sis=75 pruub=15.105497360s) [0] r=-1 lpr=75 pi=[65,75)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 148.798385620s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:03 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 75 pg[10.1d( v 51'1091 (0'0,51'1091] local-lis/les=73/74 n=5 ec=56/45 lis/c=73/65 les/c/f=74/66/0 sis=75 pruub=15.104902267s) [0] async=[0] r=-1 lpr=75 pi=[65,75)/1 crt=51'1091 mlcod 51'1091 active pruub 148.798400879s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:03 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 75 pg[10.1d( v 51'1091 (0'0,51'1091] local-lis/les=73/74 n=5 ec=56/45 lis/c=73/65 les/c/f=74/66/0 sis=75 pruub=15.104844093s) [0] r=-1 lpr=75 pi=[65,75)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 148.798400879s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:03 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 75 pg[10.d( v 51'1091 (0'0,51'1091] local-lis/les=73/74 n=6 ec=56/45 lis/c=73/65 les/c/f=74/66/0 sis=75 pruub=15.104885101s) [0] async=[0] r=-1 lpr=75 pi=[65,75)/1 crt=51'1091 mlcod 51'1091 active pruub 148.798400879s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:03 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 75 pg[10.5( v 74'1104 (0'0,74'1104] local-lis/les=73/74 n=6 ec=56/45 lis/c=73/65 les/c/f=74/66/0 sis=75 pruub=15.100663185s) [0] async=[0] r=-1 lpr=75 pi=[65,75)/1 crt=67'1101 lcod 74'1103 mlcod 74'1103 active pruub 148.794448853s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:03 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 75 pg[10.5( v 74'1104 (0'0,74'1104] local-lis/les=73/74 n=6 ec=56/45 lis/c=73/65 les/c/f=74/66/0 sis=75 pruub=15.100558281s) [0] r=-1 lpr=75 pi=[65,75)/1 crt=67'1101 lcod 74'1103 mlcod 0'0 unknown NOTIFY pruub 148.794448853s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:03 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 75 pg[10.d( v 51'1091 (0'0,51'1091] local-lis/les=73/74 n=6 ec=56/45 lis/c=73/65 les/c/f=74/66/0 sis=75 pruub=15.104282379s) [0] r=-1 lpr=75 pi=[65,75)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 148.798400879s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:03 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:03 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:03 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:03 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:03 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:03 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:04 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.5 scrub starts
Oct 10 05:48:04 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.5 scrub ok
Oct 10 05:48:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:04 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89680091b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:04.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:04 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:04 2025: (VI_0) Entering MASTER STATE
Oct 10 05:48:04 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e76 e76: 3 total, 3 up, 3 in
Oct 10 05:48:04 np0005479823 ceph-mon[74913]: Deploying daemon prometheus.compute-0 on compute-0
Oct 10 05:48:04 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:04.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:04 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e76 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:48:04 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.9 scrub starts
Oct 10 05:48:05 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.9 scrub ok
Oct 10 05:48:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:05 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:05 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89680091b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e77 e77: 3 total, 3 up, 3 in
Oct 10 05:48:05 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Oct 10 05:48:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:05 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:05 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Oct 10 05:48:06 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Oct 10 05:48:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:06 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:48:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:06.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:48:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:06 2025: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Oct 10 05:48:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:06 2025: (VI_0) Entering BACKUP STATE
Oct 10 05:48:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:06 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:06 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000065s ======
Oct 10 05:48:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:06.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Oct 10 05:48:06 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Oct 10 05:48:06 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e78 e78: 3 total, 3 up, 3 in
Oct 10 05:48:06 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.2 deep-scrub starts
Oct 10 05:48:06 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.2 deep-scrub ok
Oct 10 05:48:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:07 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:07 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:07 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89680091b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:07 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Oct 10 05:48:07 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e79 e79: 3 total, 3 up, 3 in
Oct 10 05:48:07 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 79 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=79 pruub=15.071848869s) [1] r=-1 lpr=79 pi=[65,79)/1 crt=51'1091 mlcod 0'0 active pruub 152.962234497s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:07 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 79 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=79 pruub=15.071782112s) [1] r=-1 lpr=79 pi=[65,79)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 152.962234497s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:07 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 79 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=79 pruub=15.074589729s) [1] r=-1 lpr=79 pi=[65,79)/1 crt=51'1091 mlcod 0'0 active pruub 152.965835571s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:07 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 79 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=79 pruub=15.074548721s) [1] r=-1 lpr=79 pi=[65,79)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 152.965835571s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:07 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 79 pg[10.17( v 51'1091 (0'0,51'1091] local-lis/les=64/65 n=3 ec=56/45 lis/c=64/64 les/c/f=65/65/0 sis=79 pruub=14.043064117s) [1] r=-1 lpr=79 pi=[64,79)/1 crt=51'1091 mlcod 0'0 active pruub 151.934539795s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:07 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 79 pg[10.17( v 51'1091 (0'0,51'1091] local-lis/les=64/65 n=3 ec=56/45 lis/c=64/64 les/c/f=65/65/0 sis=79 pruub=14.042997360s) [1] r=-1 lpr=79 pi=[64,79)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 151.934539795s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:07 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 79 pg[10.7( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=79 pruub=15.074128151s) [1] r=-1 lpr=79 pi=[65,79)/1 crt=51'1091 mlcod 0'0 active pruub 152.965805054s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:07 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 79 pg[10.7( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=79 pruub=15.073760033s) [1] r=-1 lpr=79 pi=[65,79)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 152.965805054s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:07 np0005479823 systemd-logind[796]: New session 37 of user zuul.
Oct 10 05:48:07 np0005479823 systemd[1]: Started Session 37 of User zuul.
Oct 10 05:48:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:07 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89680091b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:07 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.7 deep-scrub starts
Oct 10 05:48:07 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.7 deep-scrub ok
Oct 10 05:48:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:08 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8938002cb0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:08.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:08 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:08 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:08 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e80 e80: 3 total, 3 up, 3 in
Oct 10 05:48:08 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 80 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=80) [1]/[2] r=0 lpr=80 pi=[65,80)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:08 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 80 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=80) [1]/[2] r=0 lpr=80 pi=[65,80)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:08 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 80 pg[10.7( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=80) [1]/[2] r=0 lpr=80 pi=[65,80)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:08 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 80 pg[10.7( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=80) [1]/[2] r=0 lpr=80 pi=[65,80)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:08 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 80 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=80) [1]/[2] r=0 lpr=80 pi=[65,80)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:08 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 80 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=80) [1]/[2] r=0 lpr=80 pi=[65,80)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:08 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 80 pg[10.17( v 51'1091 (0'0,51'1091] local-lis/les=64/65 n=3 ec=56/45 lis/c=64/64 les/c/f=65/65/0 sis=80) [1]/[2] r=0 lpr=80 pi=[64,80)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:08 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 80 pg[10.17( v 51'1091 (0'0,51'1091] local-lis/les=64/65 n=3 ec=56/45 lis/c=64/64 les/c/f=65/65/0 sis=80) [1]/[2] r=0 lpr=80 pi=[64,80)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:08.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:08 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Oct 10 05:48:08 np0005479823 python3.9[86587]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:48:08 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.b scrub starts
Oct 10 05:48:08 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.b scrub ok
Oct 10 05:48:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:09 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:09 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:09 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e81 e81: 3 total, 3 up, 3 in
Oct 10 05:48:09 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 81 pg[10.17( v 51'1091 (0'0,51'1091] local-lis/les=80/81 n=3 ec=56/45 lis/c=64/64 les/c/f=65/65/0 sis=80) [1]/[2] async=[1] r=0 lpr=80 pi=[64,80)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:48:09 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 81 pg[10.7( v 51'1091 (0'0,51'1091] local-lis/les=80/81 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=80) [1]/[2] async=[1] r=0 lpr=80 pi=[65,80)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:48:09 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 81 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=80/81 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=80) [1]/[2] async=[1] r=0 lpr=80 pi=[65,80)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:48:09 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 81 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=80/81 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=80) [1]/[2] async=[1] r=0 lpr=80 pi=[65,80)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:48:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:09 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8938002cb0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:09 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:09 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:09 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:09 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch
Oct 10 05:48:09 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: ignoring --setuser ceph since I am not root
Oct 10 05:48:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: ignoring --setgroup ceph since I am not root
Oct 10 05:48:09 np0005479823 ceph-mgr[75218]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Oct 10 05:48:09 np0005479823 ceph-mgr[75218]: pidfile_write: ignore empty --pid-file
Oct 10 05:48:09 np0005479823 systemd[1]: session-35.scope: Deactivated successfully.
Oct 10 05:48:09 np0005479823 systemd[1]: session-35.scope: Consumed 21.211s CPU time.
Oct 10 05:48:09 np0005479823 systemd-logind[796]: Session 35 logged out. Waiting for processes to exit.
Oct 10 05:48:09 np0005479823 systemd-logind[796]: Removed session 35.
Oct 10 05:48:09 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'alerts'
Oct 10 05:48:09 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:48:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:09 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 14 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:09.935+0000 7f6000674140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Oct 10 05:48:09 np0005479823 ceph-mgr[75218]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Oct 10 05:48:09 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'balancer'
Oct 10 05:48:09 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.1e scrub starts
Oct 10 05:48:09 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.1e scrub ok
Oct 10 05:48:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:10.013+0000 7f6000674140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Oct 10 05:48:10 np0005479823 ceph-mgr[75218]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Oct 10 05:48:10 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'cephadm'
Oct 10 05:48:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:10 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 14 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:10.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:10 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:10 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e82 e82: 3 total, 3 up, 3 in
Oct 10 05:48:10 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 82 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=80/81 n=5 ec=56/45 lis/c=80/65 les/c/f=81/66/0 sis=82 pruub=14.981945038s) [1] async=[1] r=-1 lpr=82 pi=[65,82)/1 crt=51'1091 mlcod 51'1091 active pruub 155.713287354s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:10 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 82 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=80/81 n=5 ec=56/45 lis/c=80/65 les/c/f=81/66/0 sis=82 pruub=14.981849670s) [1] r=-1 lpr=82 pi=[65,82)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 155.713287354s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:10 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 82 pg[10.7( v 51'1091 (0'0,51'1091] local-lis/les=80/81 n=5 ec=56/45 lis/c=80/65 les/c/f=81/66/0 sis=82 pruub=14.981137276s) [1] async=[1] r=-1 lpr=82 pi=[65,82)/1 crt=51'1091 mlcod 51'1091 active pruub 155.713256836s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:10 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 82 pg[10.7( v 51'1091 (0'0,51'1091] local-lis/les=80/81 n=5 ec=56/45 lis/c=80/65 les/c/f=81/66/0 sis=82 pruub=14.980942726s) [1] r=-1 lpr=82 pi=[65,82)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 155.713256836s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:10 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 82 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=80/81 n=6 ec=56/45 lis/c=80/65 les/c/f=81/66/0 sis=82 pruub=14.980880737s) [1] async=[1] r=-1 lpr=82 pi=[65,82)/1 crt=51'1091 mlcod 51'1091 active pruub 155.713348389s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:10 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 82 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=80/81 n=6 ec=56/45 lis/c=80/65 les/c/f=81/66/0 sis=82 pruub=14.980735779s) [1] r=-1 lpr=82 pi=[65,82)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 155.713348389s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:10 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 82 pg[10.17( v 51'1091 (0'0,51'1091] local-lis/les=80/81 n=3 ec=56/45 lis/c=80/64 les/c/f=81/65/0 sis=82 pruub=14.980568886s) [1] async=[1] r=-1 lpr=82 pi=[64,82)/1 crt=51'1091 mlcod 51'1091 active pruub 155.713226318s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:10 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 82 pg[10.17( v 51'1091 (0'0,51'1091] local-lis/les=80/81 n=3 ec=56/45 lis/c=80/64 les/c/f=81/65/0 sis=82 pruub=14.980377197s) [1] r=-1 lpr=82 pi=[64,82)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 155.713226318s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:10.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:10 np0005479823 ceph-mon[74913]: from='mgr.14442 192.168.122.100:0/3496043196' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished
Oct 10 05:48:10 np0005479823 python3.9[86834]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:48:10 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'crash'
Oct 10 05:48:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:10.821+0000 7f6000674140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Oct 10 05:48:10 np0005479823 ceph-mgr[75218]: mgr[py] Module crash has missing NOTIFY_TYPES member
Oct 10 05:48:10 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'dashboard'
Oct 10 05:48:10 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.8 scrub starts
Oct 10 05:48:10 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.8 scrub ok
Oct 10 05:48:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:11 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:11 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'devicehealth'
Oct 10 05:48:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:11 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:11 np0005479823 ceph-mgr[75218]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Oct 10 05:48:11 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'diskprediction_local'
Oct 10 05:48:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:11.422+0000 7f6000674140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Oct 10 05:48:11 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e83 e83: 3 total, 3 up, 3 in
Oct 10 05:48:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 14 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Oct 10 05:48:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Oct 10 05:48:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]:  from numpy import show_config as show_numpy_config
Oct 10 05:48:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:11.578+0000 7f6000674140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Oct 10 05:48:11 np0005479823 ceph-mgr[75218]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Oct 10 05:48:11 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'influx'
Oct 10 05:48:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:11.645+0000 7f6000674140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Oct 10 05:48:11 np0005479823 ceph-mgr[75218]: mgr[py] Module influx has missing NOTIFY_TYPES member
Oct 10 05:48:11 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'insights'
Oct 10 05:48:11 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'iostat'
Oct 10 05:48:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:11.785+0000 7f6000674140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Oct 10 05:48:11 np0005479823 ceph-mgr[75218]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Oct 10 05:48:11 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'k8sevents'
Oct 10 05:48:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 14 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:11 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.a scrub starts
Oct 10 05:48:11 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.a scrub ok
Oct 10 05:48:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:12 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8938002cb0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:12 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'localpool'
Oct 10 05:48:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:48:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:12.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:48:12 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'mds_autoscaler'
Oct 10 05:48:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:12 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:12 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:12 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'mirroring'
Oct 10 05:48:12 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'nfs'
Oct 10 05:48:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:12.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:12.800+0000 7f6000674140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Oct 10 05:48:12 np0005479823 ceph-mgr[75218]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Oct 10 05:48:12 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'orchestrator'
Oct 10 05:48:13 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.16 scrub starts
Oct 10 05:48:13 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.16 scrub ok
Oct 10 05:48:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:13.017+0000 7f6000674140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Oct 10 05:48:13 np0005479823 ceph-mgr[75218]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Oct 10 05:48:13 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'osd_perf_query'
Oct 10 05:48:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:13.092+0000 7f6000674140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Oct 10 05:48:13 np0005479823 ceph-mgr[75218]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Oct 10 05:48:13 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'osd_support'
Oct 10 05:48:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:13.157+0000 7f6000674140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Oct 10 05:48:13 np0005479823 ceph-mgr[75218]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Oct 10 05:48:13 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'pg_autoscaler'
Oct 10 05:48:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:13.232+0000 7f6000674140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Oct 10 05:48:13 np0005479823 ceph-mgr[75218]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Oct 10 05:48:13 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'progress'
Oct 10 05:48:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:13.304+0000 7f6000674140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Oct 10 05:48:13 np0005479823 ceph-mgr[75218]: mgr[py] Module progress has missing NOTIFY_TYPES member
Oct 10 05:48:13 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'prometheus'
Oct 10 05:48:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:13 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:13 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:13 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:13.642+0000 7f6000674140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Oct 10 05:48:13 np0005479823 ceph-mgr[75218]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Oct 10 05:48:13 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'rbd_support'
Oct 10 05:48:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:13.734+0000 7f6000674140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Oct 10 05:48:13 np0005479823 ceph-mgr[75218]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Oct 10 05:48:13 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'restful'
Oct 10 05:48:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:13 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:13 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'rgw'
Oct 10 05:48:14 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.b deep-scrub starts
Oct 10 05:48:14 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.b deep-scrub ok
Oct 10 05:48:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:14 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 14 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:14.174+0000 7f6000674140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Oct 10 05:48:14 np0005479823 ceph-mgr[75218]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Oct 10 05:48:14 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'rook'
Oct 10 05:48:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:14.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:14 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:14 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:14.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:14.766+0000 7f6000674140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Oct 10 05:48:14 np0005479823 ceph-mgr[75218]: mgr[py] Module rook has missing NOTIFY_TYPES member
Oct 10 05:48:14 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'selftest'
Oct 10 05:48:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:14.838+0000 7f6000674140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Oct 10 05:48:14 np0005479823 ceph-mgr[75218]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Oct 10 05:48:14 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'snap_schedule'
Oct 10 05:48:14 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:48:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:14.919+0000 7f6000674140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Oct 10 05:48:14 np0005479823 ceph-mgr[75218]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Oct 10 05:48:14 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'stats'
Oct 10 05:48:14 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.13 scrub starts
Oct 10 05:48:14 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.13 scrub ok
Oct 10 05:48:14 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'status'
Oct 10 05:48:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:15.076+0000 7f6000674140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: mgr[py] Module status has missing NOTIFY_TYPES member
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'telegraf'
Oct 10 05:48:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:15.146+0000 7f6000674140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'telemetry'
Oct 10 05:48:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:15.299+0000 7f6000674140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'test_orchestrator'
Oct 10 05:48:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:15 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:15 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:15.527+0000 7f6000674140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'volumes'
Oct 10 05:48:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:15 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8938003db0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:15.800+0000 7f6000674140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: mgr[py] Loading python module 'zabbix'
Oct 10 05:48:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 2025-10-10T09:48:15.871+0000 7f6000674140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: mgr load Constructed class from module: dashboard
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: ms_deliver_dispatch: unhandled message 0x56163567b860 mon_map magic: 0 from mon.1 v2:192.168.122.102:3300/0
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: [prometheus DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: mgr load Constructed class from module: prometheus
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: [dashboard INFO root] server: ssl=no host=192.168.122.102 port=8443
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: [dashboard INFO root] Configured CherryPy, starting engine...
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: [dashboard INFO root] Starting engine...
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: [prometheus INFO root] server_addr: :: server_port: 9283
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: [prometheus INFO root] Starting engine...
Oct 10 05:48:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: [10/Oct/2025:09:48:15] ENGINE Bus STARTING
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: [prometheus INFO cherrypy.error] [10/Oct/2025:09:48:15] ENGINE Bus STARTING
Oct 10 05:48:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: CherryPy Checker:
Oct 10 05:48:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: The Application mounted at '' has an empty config.
Oct 10 05:48:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: 
Oct 10 05:48:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:15 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:15 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.a scrub starts
Oct 10 05:48:15 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.a scrub ok
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: [dashboard INFO root] Engine started...
Oct 10 05:48:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: [10/Oct/2025:09:48:15] ENGINE Serving on http://:::9283
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: [prometheus INFO cherrypy.error] [10/Oct/2025:09:48:15] ENGINE Serving on http://:::9283
Oct 10 05:48:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mgr-compute-2-gkrssp[75214]: [10/Oct/2025:09:48:15] ENGINE Bus STARTED
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: [prometheus INFO cherrypy.error] [10/Oct/2025:09:48:15] ENGINE Bus STARTED
Oct 10 05:48:15 np0005479823 ceph-mgr[75218]: [prometheus INFO root] Engine started.
Oct 10 05:48:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:16 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:16 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e84 e84: 3 total, 3 up, 3 in
Oct 10 05:48:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:16.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:16 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:16 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:16 np0005479823 systemd-logind[796]: New session 38 of user ceph-admin.
Oct 10 05:48:16 np0005479823 systemd[1]: Started Session 38 of User ceph-admin.
Oct 10 05:48:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:16.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:16 np0005479823 ceph-mon[74913]: Active manager daemon compute-0.xkdepb restarted
Oct 10 05:48:16 np0005479823 ceph-mon[74913]: Activating manager daemon compute-0.xkdepb
Oct 10 05:48:16 np0005479823 ceph-mon[74913]: Manager daemon compute-0.xkdepb is now available
Oct 10 05:48:16 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:16 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.xkdepb/mirror_snapshot_schedule"}]: dispatch
Oct 10 05:48:16 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.xkdepb/trash_purge_schedule"}]: dispatch
Oct 10 05:48:17 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.3 scrub starts
Oct 10 05:48:17 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.3 scrub ok
Oct 10 05:48:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:17 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:17 np0005479823 podman[87045]: 2025-10-10 09:48:17.364139558 +0000 UTC m=+0.052877455 container exec bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True)
Oct 10 05:48:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:17 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:17 np0005479823 podman[87045]: 2025-10-10 09:48:17.450110907 +0000 UTC m=+0.138848804 container exec_died bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Oct 10 05:48:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:17 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:17 np0005479823 podman[87163]: 2025-10-10 09:48:17.883889696 +0000 UTC m=+0.082478379 container exec 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 05:48:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:17 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8938003db0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:17 np0005479823 podman[87163]: 2025-10-10 09:48:17.916152293 +0000 UTC m=+0.114740926 container exec_died 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 05:48:17 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.9 scrub starts
Oct 10 05:48:18 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.9 scrub ok
Oct 10 05:48:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:18 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:18 np0005479823 podman[87254]: 2025-10-10 09:48:18.184985297 +0000 UTC m=+0.046844393 container exec c0c699e75157b12e12bd70966bb74a53cb85c47327c540646fc648e6218a8292 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, io.buildah.version=1.40.1)
Oct 10 05:48:18 np0005479823 podman[87254]: 2025-10-10 09:48:18.191378421 +0000 UTC m=+0.053237427 container exec_died c0c699e75157b12e12bd70966bb74a53cb85c47327c540646fc648e6218a8292 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 05:48:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:18.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:18 np0005479823 systemd[1]: session-37.scope: Deactivated successfully.
Oct 10 05:48:18 np0005479823 systemd[1]: session-37.scope: Consumed 7.918s CPU time.
Oct 10 05:48:18 np0005479823 systemd-logind[796]: Session 37 logged out. Waiting for processes to exit.
Oct 10 05:48:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:18 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:18 np0005479823 systemd-logind[796]: Removed session 37.
Oct 10 05:48:18 np0005479823 podman[87322]: 2025-10-10 09:48:18.378374448 +0000 UTC m=+0.053643029 container exec 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 05:48:18 np0005479823 podman[87322]: 2025-10-10 09:48:18.385244318 +0000 UTC m=+0.060512879 container exec_died 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 05:48:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:18 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:18 np0005479823 podman[87385]: 2025-10-10 09:48:18.594761122 +0000 UTC m=+0.067036037 container exec 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, io.openshift.expose-services=, description=keepalived for Ceph, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, vcs-type=git, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, com.redhat.component=keepalived-container, io.buildah.version=1.28.2)
Oct 10 05:48:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:18.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:18 np0005479823 podman[87385]: 2025-10-10 09:48:18.609203232 +0000 UTC m=+0.081478147 container exec_died 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, com.redhat.component=keepalived-container, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., release=1793, io.openshift.expose-services=, name=keepalived)
Oct 10 05:48:18 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e85 e85: 3 total, 3 up, 3 in
Oct 10 05:48:18 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Oct 10 05:48:18 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.8 scrub starts
Oct 10 05:48:19 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.8 scrub ok
Oct 10 05:48:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:19 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:19 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:19 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:19 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e86 e86: 3 total, 3 up, 3 in
Oct 10 05:48:19 np0005479823 ceph-mon[74913]: [10/Oct/2025:09:48:17] ENGINE Bus STARTING
Oct 10 05:48:19 np0005479823 ceph-mon[74913]: [10/Oct/2025:09:48:17] ENGINE Serving on http://192.168.122.100:8765
Oct 10 05:48:19 np0005479823 ceph-mon[74913]: [10/Oct/2025:09:48:18] ENGINE Serving on https://192.168.122.100:7150
Oct 10 05:48:19 np0005479823 ceph-mon[74913]: [10/Oct/2025:09:48:18] ENGINE Bus STARTED
Oct 10 05:48:19 np0005479823 ceph-mon[74913]: [10/Oct/2025:09:48:18] ENGINE Client ('192.168.122.100', 53560) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Oct 10 05:48:19 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Oct 10 05:48:19 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:19 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:19 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:19 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:19 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:48:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:19 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:19 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.2 scrub starts
Oct 10 05:48:19 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.2 scrub ok
Oct 10 05:48:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:20 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8938003db0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:20.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:20 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:20 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:48:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:20.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:48:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e87 e87: 3 total, 3 up, 3 in
Oct 10 05:48:20 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 87 pg[10.19( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=87 pruub=9.883892059s) [1] r=-1 lpr=87 pi=[65,87)/1 crt=51'1091 mlcod 0'0 active pruub 160.972961426s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:20 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 87 pg[10.19( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=87 pruub=9.883842468s) [1] r=-1 lpr=87 pi=[65,87)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 160.972961426s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:20 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 87 pg[10.9( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=87 pruub=9.875918388s) [1] r=-1 lpr=87 pi=[65,87)/1 crt=51'1091 mlcod 0'0 active pruub 160.966171265s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:20 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 87 pg[10.9( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=87 pruub=9.875630379s) [1] r=-1 lpr=87 pi=[65,87)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 160.966171265s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:20 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:20 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:20 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Oct 10 05:48:20 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Oct 10 05:48:20 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:20 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:20 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.17 scrub starts
Oct 10 05:48:20 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.17 scrub ok
Oct 10 05:48:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:21 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:21 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:21 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:21 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e88 e88: 3 total, 3 up, 3 in
Oct 10 05:48:21 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 88 pg[10.9( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=88) [1]/[2] r=0 lpr=88 pi=[65,88)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:21 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 88 pg[10.9( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=88) [1]/[2] r=0 lpr=88 pi=[65,88)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:21 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 88 pg[10.19( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=88) [1]/[2] r=0 lpr=88 pi=[65,88)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:21 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 88 pg[10.19( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=88) [1]/[2] r=0 lpr=88 pi=[65,88)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:21 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Oct 10 05:48:21 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:21 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:21 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Oct 10 05:48:21 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Oct 10 05:48:21 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:21 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:21 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Oct 10 05:48:21 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 05:48:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:21 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c0034e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:21 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.1f scrub starts
Oct 10 05:48:21 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.1f scrub ok
Oct 10 05:48:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:22 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c001090 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:22.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:22 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:22 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:22.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:22 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e89 e89: 3 total, 3 up, 3 in
Oct 10 05:48:22 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 89 pg[10.9( v 51'1091 (0'0,51'1091] local-lis/les=88/89 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=88) [1]/[2] async=[1] r=0 lpr=88 pi=[65,88)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:48:22 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 89 pg[10.19( v 51'1091 (0'0,51'1091] local-lis/les=88/89 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=88) [1]/[2] async=[1] r=0 lpr=88 pi=[65,88)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:48:22 np0005479823 ceph-mon[74913]: Updating compute-0:/etc/ceph/ceph.conf
Oct 10 05:48:22 np0005479823 ceph-mon[74913]: Updating compute-1:/etc/ceph/ceph.conf
Oct 10 05:48:22 np0005479823 ceph-mon[74913]: Updating compute-2:/etc/ceph/ceph.conf
Oct 10 05:48:22 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Oct 10 05:48:22 np0005479823 ceph-mon[74913]: Updating compute-2:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 05:48:22 np0005479823 ceph-mon[74913]: Updating compute-0:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 05:48:22 np0005479823 ceph-mon[74913]: Updating compute-1:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.conf
Oct 10 05:48:22 np0005479823 ceph-mon[74913]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Oct 10 05:48:22 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Oct 10 05:48:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:23 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:23 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:23 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e90 e90: 3 total, 3 up, 3 in
Oct 10 05:48:23 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 90 pg[10.19( v 51'1091 (0'0,51'1091] local-lis/les=88/89 n=5 ec=56/45 lis/c=88/65 les/c/f=89/66/0 sis=90 pruub=15.410189629s) [1] async=[1] r=-1 lpr=90 pi=[65,90)/1 crt=51'1091 mlcod 51'1091 active pruub 169.115798950s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:23 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 90 pg[10.19( v 51'1091 (0'0,51'1091] local-lis/les=88/89 n=5 ec=56/45 lis/c=88/65 les/c/f=89/66/0 sis=90 pruub=15.410112381s) [1] r=-1 lpr=90 pi=[65,90)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 169.115798950s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:23 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 90 pg[10.9( v 51'1091 (0'0,51'1091] local-lis/les=88/89 n=6 ec=56/45 lis/c=88/65 les/c/f=89/66/0 sis=90 pruub=15.409692764s) [1] async=[1] r=-1 lpr=90 pi=[65,90)/1 crt=51'1091 mlcod 51'1091 active pruub 169.115676880s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:23 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 90 pg[10.9( v 51'1091 (0'0,51'1091] local-lis/les=88/89 n=6 ec=56/45 lis/c=88/65 les/c/f=89/66/0 sis=90 pruub=15.409649849s) [1] r=-1 lpr=90 pi=[65,90)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 169.115676880s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:23 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c30 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:23 np0005479823 ceph-mon[74913]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Oct 10 05:48:23 np0005479823 ceph-mon[74913]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Oct 10 05:48:23 np0005479823 ceph-mon[74913]: Updating compute-2:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring
Oct 10 05:48:23 np0005479823 ceph-mon[74913]: Updating compute-0:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring
Oct 10 05:48:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:23 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:24 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89480014d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:24.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:24 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:24 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:24 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e91 e91: 3 total, 3 up, 3 in
Oct 10 05:48:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:24.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:24 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:48:24 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:24 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:24 np0005479823 ceph-mon[74913]: Updating compute-1:/var/lib/ceph/21f084a3-af34-5230-afe4-ea5cd24a55f4/config/ceph.client.admin.keyring
Oct 10 05:48:24 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:24 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:25 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:25 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e92 e92: 3 total, 3 up, 3 in
Oct 10 05:48:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:25 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c001090 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:25 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c001090 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:25 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:25 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:25 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:25 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:25 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 05:48:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:26 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:26.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:26 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:26 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e93 e93: 3 total, 3 up, 3 in
Oct 10 05:48:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:26.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:27 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:27 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:27 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c001090 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:27 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.1c scrub starts
Oct 10 05:48:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:27 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:27 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.1c scrub ok
Oct 10 05:48:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:28 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948001670 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:28.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:28 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:28 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:28.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:28 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.1a deep-scrub starts
Oct 10 05:48:28 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.1a deep-scrub ok
Oct 10 05:48:29 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e94 e94: 3 total, 3 up, 3 in
Oct 10 05:48:29 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 94 pg[10.1b( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=2 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=94 pruub=9.701593399s) [1] r=-1 lpr=94 pi=[65,94)/1 crt=51'1091 mlcod 0'0 active pruub 168.973251343s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:29 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 94 pg[10.1b( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=2 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=94 pruub=9.701540947s) [1] r=-1 lpr=94 pi=[65,94)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 168.973251343s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:29 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 94 pg[10.b( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=94 pruub=9.700680733s) [1] r=-1 lpr=94 pi=[65,94)/1 crt=51'1091 mlcod 0'0 active pruub 168.973251343s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:29 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 94 pg[10.b( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=94 pruub=9.700636864s) [1] r=-1 lpr=94 pi=[65,94)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 168.973251343s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:29 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Oct 10 05:48:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:29 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:29 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:29 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:29 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.17 scrub starts
Oct 10 05:48:29 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.17 scrub ok
Oct 10 05:48:29 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:48:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:29 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c001090 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:30 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Oct 10 05:48:30 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:30 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e95 e95: 3 total, 3 up, 3 in
Oct 10 05:48:30 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 95 pg[10.1b( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=2 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=95) [1]/[2] r=0 lpr=95 pi=[65,95)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:30 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 95 pg[10.1b( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=2 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=95) [1]/[2] r=0 lpr=95 pi=[65,95)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:30 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 95 pg[10.b( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=95) [1]/[2] r=0 lpr=95 pi=[65,95)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:30 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 95 pg[10.b( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=95) [1]/[2] r=0 lpr=95 pi=[65,95)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:30 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:30.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:30 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:30 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:30.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:30 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.1d scrub starts
Oct 10 05:48:30 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.1d scrub ok
Oct 10 05:48:31 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:31 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Oct 10 05:48:31 np0005479823 ceph-mon[74913]: Reconfiguring alertmanager.compute-0 (dependencies changed)...
Oct 10 05:48:31 np0005479823 ceph-mon[74913]: Reconfiguring daemon alertmanager.compute-0 on compute-0
Oct 10 05:48:31 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e96 e96: 3 total, 3 up, 3 in
Oct 10 05:48:31 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 96 pg[10.c( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=6 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=96 pruub=10.433774948s) [1] r=-1 lpr=96 pi=[72,96)/1 crt=51'1091 mlcod 0'0 active pruub 171.756149292s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:31 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 96 pg[10.c( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=6 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=96 pruub=10.433749199s) [1] r=-1 lpr=96 pi=[72,96)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 171.756149292s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:31 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 96 pg[10.1c( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=5 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=96 pruub=10.433111191s) [1] r=-1 lpr=96 pi=[72,96)/1 crt=51'1091 mlcod 0'0 active pruub 171.756118774s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:31 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 96 pg[10.1c( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=5 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=96 pruub=10.433096886s) [1] r=-1 lpr=96 pi=[72,96)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 171.756118774s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:31 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 96 pg[10.1b( v 51'1091 (0'0,51'1091] local-lis/les=95/96 n=2 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=95) [1]/[2] async=[1] r=0 lpr=95 pi=[65,95)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:48:31 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 96 pg[10.b( v 51'1091 (0'0,51'1091] local-lis/les=95/96 n=6 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=95) [1]/[2] async=[1] r=0 lpr=95 pi=[65,95)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:48:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:31 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:31 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:31 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948002050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:31 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.16 deep-scrub starts
Oct 10 05:48:31 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.16 deep-scrub ok
Oct 10 05:48:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:31 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:32 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Oct 10 05:48:32 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e97 e97: 3 total, 3 up, 3 in
Oct 10 05:48:32 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 97 pg[10.c( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=6 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=97) [1]/[2] r=0 lpr=97 pi=[72,97)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:32 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 97 pg[10.1c( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=5 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=97) [1]/[2] r=0 lpr=97 pi=[72,97)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:32 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 97 pg[10.c( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=6 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=97) [1]/[2] r=0 lpr=97 pi=[72,97)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:32 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 97 pg[10.b( v 51'1091 (0'0,51'1091] local-lis/les=95/96 n=6 ec=56/45 lis/c=95/65 les/c/f=96/66/0 sis=97 pruub=14.980080605s) [1] async=[1] r=-1 lpr=97 pi=[65,97)/1 crt=51'1091 mlcod 51'1091 active pruub 177.331207275s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:32 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 97 pg[10.1c( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=5 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=97) [1]/[2] r=0 lpr=97 pi=[72,97)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:32 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 97 pg[10.b( v 51'1091 (0'0,51'1091] local-lis/les=95/96 n=6 ec=56/45 lis/c=95/65 les/c/f=96/66/0 sis=97 pruub=14.979895592s) [1] r=-1 lpr=97 pi=[65,97)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 177.331207275s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:32 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 97 pg[10.1b( v 51'1091 (0'0,51'1091] local-lis/les=95/96 n=2 ec=56/45 lis/c=95/65 les/c/f=96/66/0 sis=97 pruub=14.974352837s) [1] async=[1] r=-1 lpr=97 pi=[65,97)/1 crt=51'1091 mlcod 51'1091 active pruub 177.326797485s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:32 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 97 pg[10.1b( v 51'1091 (0'0,51'1091] local-lis/les=95/96 n=2 ec=56/45 lis/c=95/65 les/c/f=96/66/0 sis=97 pruub=14.974159241s) [1] r=-1 lpr=97 pi=[65,97)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 177.326797485s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:32 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c001090 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:32.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:32 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:32 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:32.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:32 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.3 scrub starts
Oct 10 05:48:32 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.3 scrub ok
Oct 10 05:48:33 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Oct 10 05:48:33 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:33 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:33 np0005479823 ceph-mon[74913]: Reconfiguring grafana.compute-0 (dependencies changed)...
Oct 10 05:48:33 np0005479823 ceph-mon[74913]: Reconfiguring daemon grafana.compute-0 on compute-0
Oct 10 05:48:33 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e98 e98: 3 total, 3 up, 3 in
Oct 10 05:48:33 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 98 pg[10.c( v 51'1091 (0'0,51'1091] local-lis/les=97/98 n=6 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=97) [1]/[2] async=[1] r=0 lpr=97 pi=[72,97)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:48:33 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 98 pg[10.1c( v 51'1091 (0'0,51'1091] local-lis/les=97/98 n=5 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=97) [1]/[2] async=[1] r=0 lpr=97 pi=[72,97)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:48:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:33 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:33 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:33 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e99 e99: 3 total, 3 up, 3 in
Oct 10 05:48:33 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 99 pg[10.c( v 51'1091 (0'0,51'1091] local-lis/les=97/98 n=6 ec=56/45 lis/c=97/72 les/c/f=98/73/0 sis=99 pruub=15.675864220s) [1] async=[1] r=-1 lpr=99 pi=[72,99)/1 crt=51'1091 mlcod 51'1091 active pruub 179.389389038s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:33 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 99 pg[10.c( v 51'1091 (0'0,51'1091] local-lis/les=97/98 n=6 ec=56/45 lis/c=97/72 les/c/f=98/73/0 sis=99 pruub=15.675802231s) [1] r=-1 lpr=99 pi=[72,99)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 179.389389038s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:33 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 99 pg[10.1c( v 51'1091 (0'0,51'1091] local-lis/les=97/98 n=5 ec=56/45 lis/c=97/72 les/c/f=98/73/0 sis=99 pruub=15.674968719s) [1] async=[1] r=-1 lpr=99 pi=[72,99)/1 crt=51'1091 mlcod 51'1091 active pruub 179.389450073s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:33 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 99 pg[10.1c( v 51'1091 (0'0,51'1091] local-lis/les=97/98 n=5 ec=56/45 lis/c=97/72 les/c/f=98/73/0 sis=99 pruub=15.674754143s) [1] r=-1 lpr=99 pi=[72,99)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 179.389450073s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:33 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003c90 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:33 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.17 scrub starts
Oct 10 05:48:33 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.17 scrub ok
Oct 10 05:48:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:33 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948002050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:34 np0005479823 systemd-logind[796]: New session 39 of user zuul.
Oct 10 05:48:34 np0005479823 systemd[1]: Started Session 39 of User zuul.
Oct 10 05:48:34 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Oct 10 05:48:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:34 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:34.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:34 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:34 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:34 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e100 e100: 3 total, 3 up, 3 in
Oct 10 05:48:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:34.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:34 np0005479823 python3.9[88714]: ansible-ansible.legacy.ping Invoked with data=pong
Oct 10 05:48:34 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.13 scrub starts
Oct 10 05:48:34 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 9.13 scrub ok
Oct 10 05:48:34 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:48:35 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:35 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:35 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Oct 10 05:48:35 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:35 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:35 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e101 e101: 3 total, 3 up, 3 in
Oct 10 05:48:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:35 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c003a20 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:35 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.9 deep-scrub starts
Oct 10 05:48:35 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.9 deep-scrub ok
Oct 10 05:48:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:35 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003cb0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:36 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948002050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:36 np0005479823 python3.9[88889]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:48:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:36.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:36 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:36 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:36 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e102 e102: 3 total, 3 up, 3 in
Oct 10 05:48:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:36.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:36 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.e scrub starts
Oct 10 05:48:36 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.e scrub ok
Oct 10 05:48:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:37 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:37 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:37 np0005479823 python3.9[89047]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:48:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:37 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:37 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.3 scrub starts
Oct 10 05:48:37 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 11.3 scrub ok
Oct 10 05:48:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:37 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c003a20 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:38 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003cd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:38.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:38 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:38 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:38 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:38 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:38 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 05:48:38 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Oct 10 05:48:38 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:38 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:38 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 05:48:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:38.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:38 np0005479823 python3.9[89200]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:48:38 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.d scrub starts
Oct 10 05:48:38 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.d scrub ok
Oct 10 05:48:39 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e103 e103: 3 total, 3 up, 3 in
Oct 10 05:48:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:39 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:39 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:39 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Oct 10 05:48:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:39 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003cd0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:39 np0005479823 python3.9[89356]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:48:39 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.16 scrub starts
Oct 10 05:48:39 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 8.16 scrub ok
Oct 10 05:48:39 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:48:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:39 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:40 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c003a20 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:40.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:40 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:40 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e104 e104: 3 total, 3 up, 3 in
Oct 10 05:48:40 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 104 pg[10.f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=82/82 les/c/f=83/83/0 sis=104) [2] r=0 lpr=104 pi=[82,104)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:40 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 104 pg[10.1f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=82/82 les/c/f=83/83/0 sis=104) [2] r=0 lpr=104 pi=[82,104)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:40 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Oct 10 05:48:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:48:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:40.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:48:40 np0005479823 python3.9[89507]: ansible-ansible.builtin.service_facts Invoked
Oct 10 05:48:40 np0005479823 network[89524]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 10 05:48:40 np0005479823 network[89525]: 'network-scripts' will be removed from distribution in near future.
Oct 10 05:48:40 np0005479823 network[89526]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 10 05:48:40 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.18 scrub starts
Oct 10 05:48:40 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 12.18 scrub ok
Oct 10 05:48:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:41 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:41 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:41 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e105 e105: 3 total, 3 up, 3 in
Oct 10 05:48:41 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 105 pg[10.1f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=82/82 les/c/f=83/83/0 sis=105) [2]/[1] r=-1 lpr=105 pi=[82,105)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:41 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 105 pg[10.1f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=82/82 les/c/f=83/83/0 sis=105) [2]/[1] r=-1 lpr=105 pi=[82,105)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:41 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 105 pg[10.f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=82/82 les/c/f=83/83/0 sis=105) [2]/[1] r=-1 lpr=105 pi=[82,105)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:41 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 105 pg[10.f( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=82/82 les/c/f=83/83/0 sis=105) [2]/[1] r=-1 lpr=105 pi=[82,105)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:41 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Oct 10 05:48:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:41 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948002390 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:41 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.4 scrub starts
Oct 10 05:48:41 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.4 scrub ok
Oct 10 05:48:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:41 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003cf0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:42 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:48:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:42.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:48:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:42 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:42 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:42 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e106 e106: 3 total, 3 up, 3 in
Oct 10 05:48:42 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 106 pg[10.10( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=106) [2] r=0 lpr=106 pi=[56,106)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:42 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Oct 10 05:48:42 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Oct 10 05:48:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:48:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:42.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:48:42 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.13 scrub starts
Oct 10 05:48:42 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.13 scrub ok
Oct 10 05:48:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:43 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:43 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:43 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e107 e107: 3 total, 3 up, 3 in
Oct 10 05:48:43 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 107 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=105/82 les/c/f=106/83/0 sis=107) [2] r=0 lpr=107 pi=[82,107)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:43 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 107 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=105/82 les/c/f=106/83/0 sis=107) [2] r=0 lpr=107 pi=[82,107)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:43 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 107 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=105/82 les/c/f=106/83/0 sis=107) [2] r=0 lpr=107 pi=[82,107)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:43 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 107 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=6 ec=56/45 lis/c=105/82 les/c/f=106/83/0 sis=107) [2] r=0 lpr=107 pi=[82,107)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:43 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 107 pg[10.10( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=107) [2]/[1] r=-1 lpr=107 pi=[56,107)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:43 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 107 pg[10.10( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=56/56 les/c/f=57/57/0 sis=107) [2]/[1] r=-1 lpr=107 pi=[56,107)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:43 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:43 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.14 scrub starts
Oct 10 05:48:43 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.14 scrub ok
Oct 10 05:48:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:43 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948003f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:44 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003d10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:44.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:44 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:44 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:44 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e108 e108: 3 total, 3 up, 3 in
Oct 10 05:48:44 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 108 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=107/108 n=5 ec=56/45 lis/c=105/82 les/c/f=106/83/0 sis=107) [2] r=0 lpr=107 pi=[82,107)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:48:44 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 108 pg[10.f( v 51'1091 (0'0,51'1091] local-lis/les=107/108 n=6 ec=56/45 lis/c=105/82 les/c/f=106/83/0 sis=107) [2] r=0 lpr=107 pi=[82,107)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:48:44 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:44 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:48:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:48:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:44.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:48:44 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.11 scrub starts
Oct 10 05:48:44 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.11 scrub ok
Oct 10 05:48:44 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:48:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:45 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:45 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e109 e109: 3 total, 3 up, 3 in
Oct 10 05:48:45 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 109 pg[10.10( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=2 ec=56/45 lis/c=107/56 les/c/f=108/57/0 sis=109) [2] r=0 lpr=109 pi=[56,109)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:45 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 109 pg[10.10( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=2 ec=56/45 lis/c=107/56 les/c/f=108/57/0 sis=109) [2] r=0 lpr=109 pi=[56,109)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:45 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c003a20 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:45 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.3 scrub starts
Oct 10 05:48:45 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.3 scrub ok
Oct 10 05:48:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:45 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:46 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948003f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:46.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:46 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:46 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:46 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e110 e110: 3 total, 3 up, 3 in
Oct 10 05:48:46 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 110 pg[10.10( v 51'1091 (0'0,51'1091] local-lis/les=109/110 n=2 ec=56/45 lis/c=107/56 les/c/f=108/57/0 sis=109) [2] r=0 lpr=109 pi=[56,109)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:48:46 np0005479823 python3.9[89819]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:48:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:46.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:46 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.f scrub starts
Oct 10 05:48:46 np0005479823 ceph-osd[77423]: log_channel(cluster) log [DBG] : 10.f scrub ok
Oct 10 05:48:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:47 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:47 np0005479823 python3.9[89971]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:48:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:47 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:47 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8944003d30 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:47 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c003a20 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:48 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:48.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:48 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:48 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:48.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:48 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e111 e111: 3 total, 3 up, 3 in
Oct 10 05:48:48 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Oct 10 05:48:48 np0005479823 python3.9[90126]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:48:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:49 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:49 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:49 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:49 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Oct 10 05:48:49 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:48:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:49 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:50 np0005479823 python3.9[90285]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 05:48:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:50 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c003a20 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:48:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:50.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:48:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:50 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:50 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:50.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e112 e112: 3 total, 3 up, 3 in
Oct 10 05:48:50 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 112 pg[10.12( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=66/66 les/c/f=67/67/0 sis=112) [2] r=0 lpr=112 pi=[66,112)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:50 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Oct 10 05:48:51 np0005479823 python3.9[90396]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 05:48:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:51 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:51 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:51 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948003f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:51 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Oct 10 05:48:51 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e113 e113: 3 total, 3 up, 3 in
Oct 10 05:48:51 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 113 pg[10.12( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=66/66 les/c/f=67/67/0 sis=113) [2]/[0] r=-1 lpr=113 pi=[66,113)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:51 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 113 pg[10.12( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=66/66 les/c/f=67/67/0 sis=113) [2]/[0] r=-1 lpr=113 pi=[66,113)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:51 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:52 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:52.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:52 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:52 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:52.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:52 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e114 e114: 3 total, 3 up, 3 in
Oct 10 05:48:52 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 114 pg[10.13( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=114 pruub=9.957026482s) [0] r=-1 lpr=114 pi=[65,114)/1 crt=51'1091 mlcod 0'0 active pruub 192.967025757s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:52 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 114 pg[10.13( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=114 pruub=9.956990242s) [0] r=-1 lpr=114 pi=[65,114)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 192.967025757s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:52 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Oct 10 05:48:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:53 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:53 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:53 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e115 e115: 3 total, 3 up, 3 in
Oct 10 05:48:53 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 115 pg[10.12( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=4 ec=56/45 lis/c=113/66 les/c/f=114/67/0 sis=115) [2] r=0 lpr=115 pi=[66,115)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:53 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 115 pg[10.12( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=4 ec=56/45 lis/c=113/66 les/c/f=114/67/0 sis=115) [2] r=0 lpr=115 pi=[66,115)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:53 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 115 pg[10.13( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=115) [0]/[2] r=0 lpr=115 pi=[65,115)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:53 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 115 pg[10.13( v 51'1091 (0'0,51'1091] local-lis/les=65/66 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=115) [0]/[2] r=0 lpr=115 pi=[65,115)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:53 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c003a20 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:53 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Oct 10 05:48:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:53 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948003f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:54 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8938002810 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:54.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:54 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:54 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:54 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e116 e116: 3 total, 3 up, 3 in
Oct 10 05:48:54 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 116 pg[10.12( v 51'1091 (0'0,51'1091] local-lis/les=115/116 n=4 ec=56/45 lis/c=113/66 les/c/f=114/67/0 sis=115) [2] r=0 lpr=115 pi=[66,115)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:48:54 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 116 pg[10.13( v 51'1091 (0'0,51'1091] local-lis/les=115/116 n=5 ec=56/45 lis/c=65/65 les/c/f=66/66/0 sis=115) [0]/[2] async=[0] r=0 lpr=115 pi=[65,115)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:48:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:54.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:54 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:48:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:55 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:55 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e117 e117: 3 total, 3 up, 3 in
Oct 10 05:48:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 117 pg[10.13( v 51'1091 (0'0,51'1091] local-lis/les=115/116 n=5 ec=56/45 lis/c=115/65 les/c/f=116/66/0 sis=117 pruub=15.093819618s) [0] async=[0] r=-1 lpr=117 pi=[65,117)/1 crt=51'1091 mlcod 51'1091 active pruub 200.847595215s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:55 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 117 pg[10.13( v 51'1091 (0'0,51'1091] local-lis/les=115/116 n=5 ec=56/45 lis/c=115/65 les/c/f=116/66/0 sis=117 pruub=15.093749046s) [0] r=-1 lpr=117 pi=[65,117)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 200.847595215s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:55 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:55 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:56 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948003f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:56.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:56 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:56 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:56 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e118 e118: 3 total, 3 up, 3 in
Oct 10 05:48:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:48:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:56.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:48:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:57 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:57 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:57 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c001080 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:57 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c003a20 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:58 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:48:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:48:58.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:48:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:58 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:58 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:58 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Oct 10 05:48:58 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e119 e119: 3 total, 3 up, 3 in
Oct 10 05:48:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 119 pg[10.14( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=5 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=119 pruub=14.952677727s) [0] r=-1 lpr=119 pi=[72,119)/1 crt=51'1091 mlcod 0'0 active pruub 203.756973267s@ mbc={}] PeeringState::start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:58 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 119 pg[10.14( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=5 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=119 pruub=14.952638626s) [0] r=-1 lpr=119 pi=[72,119)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 203.756973267s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:48:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:48:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:48:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:48:58.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:48:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:48:59 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:48:59 2025: (VI_0) received an invalid passwd!
Oct 10 05:48:59 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Oct 10 05:48:59 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e120 e120: 3 total, 3 up, 3 in
Oct 10 05:48:59 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 120 pg[10.14( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=5 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=120) [0]/[2] r=0 lpr=120 pi=[72,120)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:48:59 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 120 pg[10.14( v 51'1091 (0'0,51'1091] local-lis/les=72/73 n=5 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=120) [0]/[2] r=0 lpr=120 pi=[72,120)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 05:48:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:59 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948003f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:48:59 np0005479823 systemd[82041]: Starting Mark boot as successful...
Oct 10 05:48:59 np0005479823 systemd[82041]: Finished Mark boot as successful.
Oct 10 05:48:59 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e120 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Oct 10 05:48:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:48:59 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c001ee0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:00 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c001ee0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:49:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:00.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:49:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:00 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:00 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:00 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Oct 10 05:49:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e121 e121: 3 total, 3 up, 3 in
Oct 10 05:49:00 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 121 pg[10.14( v 51'1091 (0'0,51'1091] local-lis/les=120/121 n=5 ec=56/45 lis/c=72/72 les/c/f=73/73/0 sis=120) [0]/[2] async=[0] r=0 lpr=120 pi=[72,120)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:49:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:49:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:00.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:49:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:01 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:01 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:01 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:01 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Oct 10 05:49:01 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e122 e122: 3 total, 3 up, 3 in
Oct 10 05:49:01 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 122 pg[10.14( v 51'1091 (0'0,51'1091] local-lis/les=120/121 n=5 ec=56/45 lis/c=120/72 les/c/f=121/73/0 sis=122 pruub=14.985898972s) [0] async=[0] r=-1 lpr=122 pi=[72,122)/1 crt=51'1091 mlcod 51'1091 active pruub 206.866821289s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:49:01 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 122 pg[10.14( v 51'1091 (0'0,51'1091] local-lis/les=120/121 n=5 ec=56/45 lis/c=120/72 les/c/f=121/73/0 sis=122 pruub=14.985840797s) [0] r=-1 lpr=122 pi=[72,122)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 206.866821289s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:49:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:01 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948003f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:02 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c001ee0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 05:49:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:02.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 05:49:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:02 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:02 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:02 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e123 e123: 3 total, 3 up, 3 in
Oct 10 05:49:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 05:49:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:02.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 05:49:02 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Oct 10 05:49:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:03 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:03 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:03 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c001ee0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:03 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Oct 10 05:49:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:03 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:04 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8948003f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 05:49:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:04.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 05:49:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:04 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:04 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 05:49:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:04.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 05:49:04 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:49:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:05 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:05 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:05 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c001ee0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:05 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c004340 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:06 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c004340 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:06.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:06 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:06 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:06.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:07 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:07 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:07 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c004340 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:07 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c004340 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:08 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f896800a2b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:08.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:08 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:08 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:08.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:08 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Oct 10 05:49:08 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e124 e124: 3 total, 3 up, 3 in
Oct 10 05:49:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:09 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:09 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:09 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c003900 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:09 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Oct 10 05:49:09 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e124 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:49:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:09 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c003900 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:10 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c003900 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 05:49:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:10.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 05:49:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:10 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:10 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:10.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:10 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Oct 10 05:49:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e125 e125: 3 total, 3 up, 3 in
Oct 10 05:49:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:11 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:11 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8930000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:11 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Oct 10 05:49:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:11 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f892c000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:12 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c003900 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 05:49:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:12.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 05:49:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:12 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:12 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 05:49:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:12.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 05:49:12 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Oct 10 05:49:12 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e126 e126: 3 total, 3 up, 3 in
Oct 10 05:49:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:13 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:13 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:13 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e127 e127: 3 total, 3 up, 3 in
Oct 10 05:49:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:13 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c0044c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:13 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Oct 10 05:49:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:13 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c0044c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:14 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f892c0016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:14.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:14 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:14 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:14 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e128 e128: 3 total, 3 up, 3 in
Oct 10 05:49:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:14.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:14 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:49:14 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Oct 10 05:49:14 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Oct 10 05:49:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:15 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:15 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e129 e129: 3 total, 3 up, 3 in
Oct 10 05:49:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:15 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f895c003900 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:15 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c0044c0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:16 np0005479823 kernel: ganesha.nfsd[90534]: segfault at 50 ip 00007f8a14c6a32e sp 00007f89d0ff8210 error 4 in libntirpc.so.5.8[7f8a14c4f000+2c000] likely on CPU 5 (core 0, socket 5)
Oct 10 05:49:16 np0005479823 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Oct 10 05:49:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[85050]: 10/10/2025 09:49:16 : epoch 68e8d61f : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f893c0044c0 fd 47 proxy ignored for local
Oct 10 05:49:16 np0005479823 systemd[1]: Created slice Slice /system/systemd-coredump.
Oct 10 05:49:16 np0005479823 systemd[1]: Started Process Core Dump (PID 90582/UID 0).
Oct 10 05:49:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:16.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:16 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:16 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:16 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e130 e130: 3 total, 3 up, 3 in
Oct 10 05:49:16 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Oct 10 05:49:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000030s ======
Oct 10 05:49:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:16.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct 10 05:49:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/094916 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 05:49:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:17 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:17 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:17 np0005479823 systemd-coredump[90583]: Process 85054 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 63:#012#0  0x00007f8a14c6a32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Oct 10 05:49:17 np0005479823 systemd[1]: systemd-coredump@0-90582-0.service: Deactivated successfully.
Oct 10 05:49:17 np0005479823 systemd[1]: systemd-coredump@0-90582-0.service: Consumed 1.182s CPU time.
Oct 10 05:49:17 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Oct 10 05:49:17 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e131 e131: 3 total, 3 up, 3 in
Oct 10 05:49:17 np0005479823 podman[90598]: 2025-10-10 09:49:17.558629243 +0000 UTC m=+0.027462861 container died c0c699e75157b12e12bd70966bb74a53cb85c47327c540646fc648e6218a8292 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Oct 10 05:49:17 np0005479823 systemd[1]: var-lib-containers-storage-overlay-65c18ffc3984bb82f7acc157cc3b25e9b8553569bbeae84a5fa3da5f5bd939d9-merged.mount: Deactivated successfully.
Oct 10 05:49:17 np0005479823 podman[90598]: 2025-10-10 09:49:17.591254203 +0000 UTC m=+0.060087811 container remove c0c699e75157b12e12bd70966bb74a53cb85c47327c540646fc648e6218a8292 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 05:49:17 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Main process exited, code=exited, status=139/n/a
Oct 10 05:49:17 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Failed with result 'exit-code'.
Oct 10 05:49:17 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.693s CPU time.
Oct 10 05:49:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:18.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:18 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:18 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:18 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e132 e132: 3 total, 3 up, 3 in
Oct 10 05:49:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 05:49:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:18.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 05:49:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:19 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:19 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:19 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e133 e133: 3 total, 3 up, 3 in
Oct 10 05:49:19 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:49:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 05:49:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:20.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 05:49:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:20 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:20 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e134 e134: 3 total, 3 up, 3 in
Oct 10 05:49:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 05:49:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:20.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 05:49:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:21 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:21 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/094922 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 05:49:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:22.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:22 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:22 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:22.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:23 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:23 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:24.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:24 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:24 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:24 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Oct 10 05:49:24 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e135 e135: 3 total, 3 up, 3 in
Oct 10 05:49:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:24.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:24 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e135 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:49:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:25 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:25 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:25 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Oct 10 05:49:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:26.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:26 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:26 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 05:49:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:26.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 05:49:26 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Oct 10 05:49:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e136 e136: 3 total, 3 up, 3 in
Oct 10 05:49:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:27 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:27 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:27 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Oct 10 05:49:27 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Scheduled restart job, restart counter is at 1.
Oct 10 05:49:27 np0005479823 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 05:49:27 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.693s CPU time.
Oct 10 05:49:27 np0005479823 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 05:49:28 np0005479823 podman[90703]: 2025-10-10 09:49:28.13715586 +0000 UTC m=+0.036707547 container create dafcf9f07572197bf76ca7cc7bd5d0c51473523dfb9e3b5f50db1aea12308ef7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 05:49:28 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1b345d6ff3a1e4aeaa218ddb360f02e9bc3886d1e16b0ede0f4a70a77a5db6da/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 05:49:28 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1b345d6ff3a1e4aeaa218ddb360f02e9bc3886d1e16b0ede0f4a70a77a5db6da/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 05:49:28 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1b345d6ff3a1e4aeaa218ddb360f02e9bc3886d1e16b0ede0f4a70a77a5db6da/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 05:49:28 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1b345d6ff3a1e4aeaa218ddb360f02e9bc3886d1e16b0ede0f4a70a77a5db6da/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 05:49:28 np0005479823 podman[90703]: 2025-10-10 09:49:28.193260015 +0000 UTC m=+0.092811722 container init dafcf9f07572197bf76ca7cc7bd5d0c51473523dfb9e3b5f50db1aea12308ef7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Oct 10 05:49:28 np0005479823 podman[90703]: 2025-10-10 09:49:28.198277909 +0000 UTC m=+0.097829596 container start dafcf9f07572197bf76ca7cc7bd5d0c51473523dfb9e3b5f50db1aea12308ef7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 05:49:28 np0005479823 bash[90703]: dafcf9f07572197bf76ca7cc7bd5d0c51473523dfb9e3b5f50db1aea12308ef7
Oct 10 05:49:28 np0005479823 podman[90703]: 2025-10-10 09:49:28.121464258 +0000 UTC m=+0.021015965 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:49:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:28 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 05:49:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:28 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 05:49:28 np0005479823 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 05:49:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:28 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 05:49:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:28 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 05:49:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:28 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 05:49:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:28 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 05:49:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:28 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 05:49:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:28 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 05:49:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:28.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:28 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:28 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:28.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:28 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e137 e137: 3 total, 3 up, 3 in
Oct 10 05:49:28 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 137 pg[10.1e( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=80/80 les/c/f=81/81/0 sis=137) [2] r=0 lpr=137 pi=[80,137)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:49:28 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Oct 10 05:49:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:29 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:29 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:29 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Oct 10 05:49:29 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e138 e138: 3 total, 3 up, 3 in
Oct 10 05:49:29 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 138 pg[10.1e( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=80/80 les/c/f=81/81/0 sis=138) [2]/[1] r=-1 lpr=138 pi=[80,138)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:49:29 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 138 pg[10.1e( empty local-lis/les=0/0 n=0 ec=56/45 lis/c=80/80 les/c/f=81/81/0 sis=138) [2]/[1] r=-1 lpr=138 pi=[80,138)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Oct 10 05:49:29 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:49:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000030s ======
Oct 10 05:49:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:30.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct 10 05:49:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:30 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:30 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:30.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e139 e139: 3 total, 3 up, 3 in
Oct 10 05:49:30 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 139 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=107/108 n=5 ec=56/45 lis/c=107/107 les/c/f=108/108/0 sis=139 pruub=9.683565140s) [1] r=-1 lpr=139 pi=[107,139)/1 crt=51'1091 mlcod 0'0 active pruub 230.735855103s@ mbc={}] PeeringState::start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:49:30 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 139 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=107/108 n=5 ec=56/45 lis/c=107/107 les/c/f=108/108/0 sis=139 pruub=9.683433533s) [1] r=-1 lpr=139 pi=[107,139)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 230.735855103s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:49:30 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Oct 10 05:49:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:31 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:31 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:31 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e140 e140: 3 total, 3 up, 3 in
Oct 10 05:49:31 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 140 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=107/108 n=5 ec=56/45 lis/c=107/107 les/c/f=108/108/0 sis=140) [1]/[2] r=0 lpr=140 pi=[107,140)/1 crt=51'1091 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:49:31 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 140 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=107/108 n=5 ec=56/45 lis/c=107/107 les/c/f=108/108/0 sis=140) [1]/[2] r=0 lpr=140 pi=[107,140)/1 crt=51'1091 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Oct 10 05:49:31 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 140 pg[10.1e( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=138/80 les/c/f=139/81/0 sis=140) [2] r=0 lpr=140 pi=[80,140)/1 luod=0'0 crt=51'1091 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:49:31 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 140 pg[10.1e( v 51'1091 (0'0,51'1091] local-lis/les=0/0 n=5 ec=56/45 lis/c=138/80 les/c/f=139/81/0 sis=140) [2] r=0 lpr=140 pi=[80,140)/1 crt=51'1091 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Oct 10 05:49:31 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Oct 10 05:49:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:32.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:32 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:32 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:32.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:32 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e141 e141: 3 total, 3 up, 3 in
Oct 10 05:49:32 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 141 pg[10.1e( v 51'1091 (0'0,51'1091] local-lis/les=140/141 n=5 ec=56/45 lis/c=138/80 les/c/f=139/81/0 sis=140) [2] r=0 lpr=140 pi=[80,140)/1 crt=51'1091 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:49:33 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 141 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=140/141 n=5 ec=56/45 lis/c=107/107 les/c/f=108/108/0 sis=140) [1]/[2] async=[1] r=0 lpr=140 pi=[107,140)/1 crt=51'1091 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Oct 10 05:49:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:33 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:33 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:33 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e142 e142: 3 total, 3 up, 3 in
Oct 10 05:49:33 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 142 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=140/141 n=5 ec=56/45 lis/c=140/107 les/c/f=141/108/0 sis=142 pruub=15.522687912s) [1] async=[1] r=-1 lpr=142 pi=[107,142)/1 crt=51'1091 mlcod 51'1091 active pruub 239.268997192s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Oct 10 05:49:33 np0005479823 ceph-osd[77423]: osd.2 pg_epoch: 142 pg[10.1f( v 51'1091 (0'0,51'1091] local-lis/les=140/141 n=5 ec=56/45 lis/c=140/107 les/c/f=141/108/0 sis=142 pruub=15.522623062s) [1] r=-1 lpr=142 pi=[107,142)/1 crt=51'1091 mlcod 0'0 unknown NOTIFY pruub 239.268997192s@ mbc={}] state<Start>: transitioning to Stray
Oct 10 05:49:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Oct 10 05:49:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Oct 10 05:49:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 05:49:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 05:49:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 05:49:34 np0005479823 python3.9[90943]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:49:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:34.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:34 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:34 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:34 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 e143: 3 total, 3 up, 3 in
Oct 10 05:49:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 05:49:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 05:49:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 05:49:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 05:49:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:34.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 05:49:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 05:49:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 05:49:34 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:49:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:35 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:35 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.527277) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089775527312, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 3147, "num_deletes": 252, "total_data_size": 9639001, "memory_usage": 9779192, "flush_reason": "Manual Compaction"}
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089775557627, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 6118759, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7607, "largest_seqno": 10749, "table_properties": {"data_size": 6104678, "index_size": 9103, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3909, "raw_key_size": 34854, "raw_average_key_size": 22, "raw_value_size": 6074200, "raw_average_value_size": 3949, "num_data_blocks": 395, "num_entries": 1538, "num_filter_entries": 1538, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089676, "oldest_key_time": 1760089676, "file_creation_time": 1760089775, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 30439 microseconds, and 12322 cpu microseconds.
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.557704) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 6118759 bytes OK
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.557735) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.559100) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.559117) EVENT_LOG_v1 {"time_micros": 1760089775559112, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.559146) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 9623836, prev total WAL file size 9623836, number of live WAL files 2.
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.561223) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(5975KB)], [18(10MB)]
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089775561299, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 17453573, "oldest_snapshot_seqno": -1}
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 4077 keys, 13425836 bytes, temperature: kUnknown
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089775629611, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 13425836, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13393323, "index_size": 21203, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10245, "raw_key_size": 104148, "raw_average_key_size": 25, "raw_value_size": 13313454, "raw_average_value_size": 3265, "num_data_blocks": 912, "num_entries": 4077, "num_filter_entries": 4077, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760089775, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.630082) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 13425836 bytes
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.632416) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 255.1 rd, 196.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(5.8, 10.8 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(5.0) write-amplify(2.2) OK, records in: 4615, records dropped: 538 output_compression: NoCompression
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.632450) EVENT_LOG_v1 {"time_micros": 1760089775632435, "job": 8, "event": "compaction_finished", "compaction_time_micros": 68416, "compaction_time_cpu_micros": 30577, "output_level": 6, "num_output_files": 1, "total_output_size": 13425836, "num_input_records": 4615, "num_output_records": 4077, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089775634690, "job": 8, "event": "table_file_deletion", "file_number": 20}
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089775638264, "job": 8, "event": "table_file_deletion", "file_number": 18}
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.561062) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.638355) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.638362) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.638365) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.638367) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 05:49:35 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:49:35.638368) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 05:49:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000030s ======
Oct 10 05:49:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:36.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct 10 05:49:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:36 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:36 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:36 np0005479823 python3.9[91232]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Oct 10 05:49:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:36.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/094936 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 05:49:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:37 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:37 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:37 np0005479823 python3.9[91386]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Oct 10 05:49:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:38.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:38 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:38 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:38 np0005479823 python3.9[91538]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:49:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:38.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:39 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:39 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:39 np0005479823 python3.9[91692]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Oct 10 05:49:39 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:49:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 05:49:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:40.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:40 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:40 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:40.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000005:nfs.cephfs.1: -2
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Oct 10 05:49:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:40 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 05:49:40 np0005479823 python3.9[91852]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:49:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:41 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:41 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:41 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda7c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:41 np0005479823 python3.9[92009]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:49:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:41 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0016c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:42 np0005479823 python3.9[92091]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:49:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:42 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda58000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:42.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:42 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:42 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000030s ======
Oct 10 05:49:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:42.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct 10 05:49:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:43 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:43 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:43 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0016c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:43 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:44 np0005479823 python3.9[92245]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Oct 10 05:49:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/094944 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 05:49:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:44 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 05:49:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:44.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 05:49:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:44 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:44 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:44.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:44 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:49:44 np0005479823 python3.9[92449]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Oct 10 05:49:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:45 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:45 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:45 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:45 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:46 np0005479823 python3.9[92633]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 10 05:49:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:46 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:46.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:46 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:46 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:46.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:46 np0005479823 python3.9[92786]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Oct 10 05:49:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:47 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:47 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:47 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:47 np0005479823 python3.9[92939]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 05:49:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:47 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda580016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:48 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:48 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000029s ======
Oct 10 05:49:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:48.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Oct 10 05:49:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:48 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:48.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:49 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:49:49 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:49:49 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 05:49:49 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:49:49 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:49:49 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 05:49:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:49 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:49 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:49 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:49 np0005479823 python3.9[93094]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:49:49 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:49:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:49 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54002140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:50 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda58002050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:50 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:50.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:50 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:50 np0005479823 python3.9[93246]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:49:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:50.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:51 np0005479823 python3.9[93351]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:49:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:51 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:51 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:51 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:51 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:52 np0005479823 python3.9[93503]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:49:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:52 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:52 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:52.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:52 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:52 np0005479823 python3.9[93581]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:49:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:52.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:53 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:53 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:53 np0005479823 python3.9[93735]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 05:49:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:53 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda58002050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:53 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:54 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54002140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:54 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000030s ======
Oct 10 05:49:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:54.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct 10 05:49:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:54 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:54 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:49:54 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:49:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:54.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:54 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:49:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:55 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:55 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:55 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:55 np0005479823 python3.9[93913]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:49:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:55 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda58002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:56 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:56 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:56.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:56 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:56 np0005479823 python3.9[94066]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Oct 10 05:49:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:56.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:57 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:57 np0005479823 python3.9[94217]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:49:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:57 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:57 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda540030a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:57 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:58 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda58002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:58 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:49:58.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:58 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:49:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:49:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:49:58.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:49:59 np0005479823 python3.9[94371]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:49:59 np0005479823 systemd[1]: Stopping Dynamic System Tuning Daemon...
Oct 10 05:49:59 np0005479823 systemd[1]: tuned.service: Deactivated successfully.
Oct 10 05:49:59 np0005479823 systemd[1]: Stopped Dynamic System Tuning Daemon.
Oct 10 05:49:59 np0005479823 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct 10 05:49:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:49:59 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:49:59 2025: (VI_0) received an invalid passwd!
Oct 10 05:49:59 np0005479823 systemd[1]: Started Dynamic System Tuning Daemon.
Oct 10 05:49:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:59 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:49:59 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:49:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:49:59 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda540030a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:00 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:00 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:00.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:00 np0005479823 python3.9[94532]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Oct 10 05:50:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:00 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:00 np0005479823 ceph-mon[74913]: overall HEALTH_OK
Oct 10 05:50:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:00.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:01 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:01 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:01 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda58002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:01 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:02 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:02 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:50:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:02.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:50:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:02 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:02.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:03 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:03 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:03 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:03 np0005479823 python3.9[94688]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:50:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:03 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:04 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda4c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:04 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:04.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:04 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:04 np0005479823 python3.9[94842]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:50:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:04.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:04 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:50:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:05 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:05 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:05 np0005479823 systemd[1]: session-39.scope: Deactivated successfully.
Oct 10 05:50:05 np0005479823 systemd[1]: session-39.scope: Consumed 1min 1.913s CPU time.
Oct 10 05:50:05 np0005479823 systemd-logind[796]: Session 39 logged out. Waiting for processes to exit.
Oct 10 05:50:05 np0005479823 systemd-logind[796]: Removed session 39.
Oct 10 05:50:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:05 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:06 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:06 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60004070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:06 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:06.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:06 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:06.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:07 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:07 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:07 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda4c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:08 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:08 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:08 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:08.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:08 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:08.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:09 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:09 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:09 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60004070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:09 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:50:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:10 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda4c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:10 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:10 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000031s ======
Oct 10 05:50:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:10.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Oct 10 05:50:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:10 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:10.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:11 np0005479823 systemd-logind[796]: New session 40 of user zuul.
Oct 10 05:50:11 np0005479823 systemd[1]: Started Session 40 of User zuul.
Oct 10 05:50:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:11 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:11 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:11 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:12 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda60004070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:12 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda4c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:12 np0005479823 python3.9[95055]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:50:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:12 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000031s ======
Oct 10 05:50:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:12.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Oct 10 05:50:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:12 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:12.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:13 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:13 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:13 np0005479823 python3.9[95213]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Oct 10 05:50:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:13 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:14 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:14 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda78001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:14 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:14.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:14 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:14.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:14 np0005479823 python3.9[95368]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 05:50:14 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:50:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:15 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:15 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:15 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda4c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:15 np0005479823 python3.9[95453]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 10 05:50:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:16 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:16 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:16 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:16.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:16 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:16.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:17 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:17 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:17 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda78001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:17 np0005479823 python3.9[95608]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 05:50:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:18 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda4c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:18 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:18 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:18.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:18 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:18.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:19 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:19 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:19 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:19 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:50:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:20 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda780027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:20 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda4c003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:20 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:20 np0005479823 python3.9[95763]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 10 05:50:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:20.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:20 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:20.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:21 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:21 np0005479823 python3.9[95918]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:50:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:21 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:21 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:22 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:22 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:22 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:22.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:22 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:22 np0005479823 python3.9[96070]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Oct 10 05:50:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000031s ======
Oct 10 05:50:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:22.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Oct 10 05:50:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:23 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:23 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:23 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda780027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:23 np0005479823 python3.9[96222]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:50:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:24 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:24 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:24 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000030s ======
Oct 10 05:50:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:24.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct 10 05:50:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:24 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:24.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:24 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:50:24 np0005479823 python3.9[96381]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 05:50:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:25 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:25 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:25 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda4c003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:26 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda780027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:26 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:26 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:26.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:26 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:26.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:27 np0005479823 python3.9[96537]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:50:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:27 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:27 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:27 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:28 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:28 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda4c003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:28 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:28.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:28 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:28.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:29 np0005479823 python3.9[96826]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 10 05:50:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:29 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:29 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:29 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda78003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:29 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:50:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:30 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:30 np0005479823 python3.9[96976]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:50:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:30 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:30 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:30.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:30 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000031s ======
Oct 10 05:50:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:30.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Oct 10 05:50:30 np0005479823 python3.9[97154]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 05:50:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:31 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:31 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:31 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda4c003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:32 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda78003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:32 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:32 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000031s ======
Oct 10 05:50:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:32.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Oct 10 05:50:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:32 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:32.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:33 np0005479823 python3.9[97312]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 05:50:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:33 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:33 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:33 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda4c003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:34 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda78003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:34 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:34.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:34 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:34.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:34 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:50:35 np0005479823 python3.9[97467]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:50:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:35 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:35 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:35 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda54003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:36 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda6c0023d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:50:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[90719]: 10/10/2025 09:50:36 : epoch 68e8d6a8 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fda58001040 fd 38 proxy ignored for local
Oct 10 05:50:36 np0005479823 kernel: ganesha.nfsd[97494]: segfault at 50 ip 00007fdb2e01632e sp 00007fdaee7fb210 error 4 in libntirpc.so.5.8[7fdb2dffb000+2c000] likely on CPU 2 (core 0, socket 2)
Oct 10 05:50:36 np0005479823 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Oct 10 05:50:36 np0005479823 systemd[1]: Started Process Core Dump (PID 97623/UID 0).
Oct 10 05:50:36 np0005479823 python3.9[97622]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Oct 10 05:50:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:36 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:36.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:36 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:36.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:37 np0005479823 systemd-coredump[97624]: Process 90723 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 56:#012#0  0x00007fdb2e01632e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Oct 10 05:50:37 np0005479823 systemd[1]: systemd-coredump@1-97623-0.service: Deactivated successfully.
Oct 10 05:50:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:37 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:37 np0005479823 systemd[1]: session-40.scope: Deactivated successfully.
Oct 10 05:50:37 np0005479823 systemd[1]: session-40.scope: Consumed 17.687s CPU time.
Oct 10 05:50:37 np0005479823 systemd-logind[796]: Session 40 logged out. Waiting for processes to exit.
Oct 10 05:50:37 np0005479823 systemd-logind[796]: Removed session 40.
Oct 10 05:50:37 np0005479823 podman[97655]: 2025-10-10 09:50:37.395284857 +0000 UTC m=+0.030011196 container died dafcf9f07572197bf76ca7cc7bd5d0c51473523dfb9e3b5f50db1aea12308ef7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid)
Oct 10 05:50:37 np0005479823 systemd[1]: var-lib-containers-storage-overlay-1b345d6ff3a1e4aeaa218ddb360f02e9bc3886d1e16b0ede0f4a70a77a5db6da-merged.mount: Deactivated successfully.
Oct 10 05:50:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:37 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:37 np0005479823 podman[97655]: 2025-10-10 09:50:37.433952286 +0000 UTC m=+0.068678605 container remove dafcf9f07572197bf76ca7cc7bd5d0c51473523dfb9e3b5f50db1aea12308ef7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True)
Oct 10 05:50:37 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Main process exited, code=exited, status=139/n/a
Oct 10 05:50:37 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Failed with result 'exit-code'.
Oct 10 05:50:37 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.232s CPU time.
Oct 10 05:50:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:38 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:38.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:38 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:38.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:39 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:39 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:39 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:50:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:40 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:40.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:40 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:40.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:41 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:41 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095042 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 05:50:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:42 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:42.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:42 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:42.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:43 np0005479823 systemd-logind[796]: New session 41 of user zuul.
Oct 10 05:50:43 np0005479823 systemd[1]: Started Session 41 of User zuul.
Oct 10 05:50:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:43 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:43 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:44 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:44.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:44 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:44 np0005479823 python3.9[97858]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:50:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:44.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:44 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:50:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:45 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:45 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:45 np0005479823 python3.9[98014]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 05:50:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:46 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:46 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:46.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:46.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:46 np0005479823 python3.9[98208]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:50:47 np0005479823 systemd[1]: session-41.scope: Deactivated successfully.
Oct 10 05:50:47 np0005479823 systemd[1]: session-41.scope: Consumed 2.431s CPU time.
Oct 10 05:50:47 np0005479823 systemd-logind[796]: Session 41 logged out. Waiting for processes to exit.
Oct 10 05:50:47 np0005479823 systemd-logind[796]: Removed session 41.
Oct 10 05:50:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:47 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:47 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:47 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Scheduled restart job, restart counter is at 2.
Oct 10 05:50:47 np0005479823 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 05:50:47 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.232s CPU time.
Oct 10 05:50:47 np0005479823 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 05:50:47 np0005479823 podman[98282]: 2025-10-10 09:50:47.976968245 +0000 UTC m=+0.044330913 container create ae52613467d0e214ed6da8b5b7944d06483a73f40967b1a4665c8b125febf9ec (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Oct 10 05:50:48 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93915b46d87e5adfc5a8e959d16f7d82e85ff82cf718b869d3a86bc987db93cb/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 05:50:48 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93915b46d87e5adfc5a8e959d16f7d82e85ff82cf718b869d3a86bc987db93cb/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 05:50:48 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93915b46d87e5adfc5a8e959d16f7d82e85ff82cf718b869d3a86bc987db93cb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 05:50:48 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/93915b46d87e5adfc5a8e959d16f7d82e85ff82cf718b869d3a86bc987db93cb/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 05:50:48 np0005479823 podman[98282]: 2025-10-10 09:50:48.049521647 +0000 UTC m=+0.116884345 container init ae52613467d0e214ed6da8b5b7944d06483a73f40967b1a4665c8b125febf9ec (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Oct 10 05:50:48 np0005479823 podman[98282]: 2025-10-10 09:50:47.955859252 +0000 UTC m=+0.023221960 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:50:48 np0005479823 podman[98282]: 2025-10-10 09:50:48.056789159 +0000 UTC m=+0.124151837 container start ae52613467d0e214ed6da8b5b7944d06483a73f40967b1a4665c8b125febf9ec (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 05:50:48 np0005479823 bash[98282]: ae52613467d0e214ed6da8b5b7944d06483a73f40967b1a4665c8b125febf9ec
Oct 10 05:50:48 np0005479823 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 05:50:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:50:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 05:50:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:50:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 05:50:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:50:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 05:50:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:50:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 05:50:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:50:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 05:50:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:50:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 05:50:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:50:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 05:50:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:50:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 05:50:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:48 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:48 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:48.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000031s ======
Oct 10 05:50:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:48.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Oct 10 05:50:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:49 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:49 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:49 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:50:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:50 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:50 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:50.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:50.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:51 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:51 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:52 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:52 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:52.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000031s ======
Oct 10 05:50:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:52.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Oct 10 05:50:53 np0005479823 systemd-logind[796]: New session 42 of user zuul.
Oct 10 05:50:53 np0005479823 systemd[1]: Started Session 42 of User zuul.
Oct 10 05:50:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:53 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:53 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:50:54 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 05:50:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:50:54 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 05:50:54 np0005479823 python3.9[98574]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:50:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:54 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:54 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:54.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:54.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:54 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:50:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:55 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:55 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:55 np0005479823 python3.9[98762]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:50:55 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 05:50:55 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:50:55 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:50:55 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 05:50:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:56 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:56 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:56.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:56 np0005479823 python3.9[98918]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 05:50:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000030s ======
Oct 10 05:50:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:56.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct 10 05:50:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:57 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:57 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:58 np0005479823 python3.9[99004]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 05:50:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:58 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:58 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:50:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:50:58.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:50:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:50:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000030s ======
Oct 10 05:50:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:50:58.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Oct 10 05:50:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:50:59 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:50:59 2025: (VI_0) received an invalid passwd!
Oct 10 05:50:59 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:50:59 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:50:59 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:51:00 np0005479823 python3.9[99184]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:00 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:00 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:00.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:00.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:01 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:01 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:01 np0005479823 python3.9[99397]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:51:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:01 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:02 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095102 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 05:51:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:02 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:02 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:02 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:02.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:02 np0005479823 systemd[1]: session-20.scope: Deactivated successfully.
Oct 10 05:51:02 np0005479823 systemd[1]: session-20.scope: Consumed 8.834s CPU time.
Oct 10 05:51:02 np0005479823 systemd-logind[796]: Session 20 logged out. Waiting for processes to exit.
Oct 10 05:51:02 np0005479823 systemd-logind[796]: Removed session 20.
Oct 10 05:51:02 np0005479823 python3.9[99549]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:51:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:02.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:03 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:03 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:03 np0005479823 python3.9[99716]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:51:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:03 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:04 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150001680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:04 np0005479823 python3.9[99794]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:51:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:04 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1400016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:04 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:04 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:04.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000031s ======
Oct 10 05:51:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:04.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Oct 10 05:51:04 np0005479823 python3.9[99947]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:51:04 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:51:05 np0005479823 python3.9[100026]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:51:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:05 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:05 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:05 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:06 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:06 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150001680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:06 np0005479823 python3.9[100178]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:51:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:06 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:06 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:06.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:06.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:06 np0005479823 python3.9[100331]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:51:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:07 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:07 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:07 np0005479823 python3.9[100484]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:51:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:07 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1400016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:08 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144001f40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:08 np0005479823 python3.9[100636]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:51:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:08 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:08 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:08 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:08.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:08.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:09 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:09 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:09 np0005479823 python3.9[100790]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 05:51:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:09 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150001680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:09 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:51:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:10 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1400016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:10 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144001f40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:10 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:10 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:10.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:10.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:11 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:11 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095111 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 05:51:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:11 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:12 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:12 np0005479823 python3.9[100970]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:51:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:12 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:12 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:12 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:12.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:12.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:13 np0005479823 python3.9[101126]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:51:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:13 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:13 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:13 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144001f40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:14 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:14 np0005479823 python3.9[101278]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:51:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:14 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:14 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:14 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:14.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:14.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:14 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:51:15 np0005479823 python3.9[101432]: ansible-service_facts Invoked
Oct 10 05:51:15 np0005479823 network[101449]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 10 05:51:15 np0005479823 network[101450]: 'network-scripts' will be removed from distribution in near future.
Oct 10 05:51:15 np0005479823 network[101451]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 10 05:51:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:15 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:15 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:15 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:16 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144003340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:16 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:16 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:16 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000066s ======
Oct 10 05:51:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:16.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000066s
Oct 10 05:51:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:51:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:16.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:51:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:17 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:17 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:17 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:18 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:18 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144003340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:18 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:18 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:51:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:18.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:51:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:18.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:19 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:19 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:19 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:19 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:51:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:20 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:20 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 05:51:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:20 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:20 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:20 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:20.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:51:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:20.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:51:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:21 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:21 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:21 np0005479823 python3.9[101912]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 05:51:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:21 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:22 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:22 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:22 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:22 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:22.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:22.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:23 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 05:51:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:23 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 05:51:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:23 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:23 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:23 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:24 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:24 np0005479823 python3.9[102067]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Oct 10 05:51:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:24 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:24 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:24 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:51:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:24.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:51:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:24.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:24 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:51:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:25 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:25 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:25 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:25 np0005479823 python3.9[102221]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:51:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:26 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:26 np0005479823 python3.9[102299]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:51:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:26 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 05:51:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:26 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:26 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:26 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:26.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:26.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:27 np0005479823 python3.9[102453]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:51:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:27 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:27 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:27 np0005479823 python3.9[102531]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:51:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:27 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:28 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:28 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:28 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:28 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:28.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:28.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:29 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:29 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:29 np0005479823 python3.9[102685]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:51:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:29 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:29 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:51:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:30 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:30 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:30 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:30 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:30.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:51:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:30.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:51:31 np0005479823 python3.9[102839]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 05:51:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:31 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:31 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095131 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 05:51:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:31 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:32 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:32 np0005479823 python3.9[102950]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:51:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:32 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:32 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:32 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:32.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:32.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:33 np0005479823 systemd[1]: session-42.scope: Deactivated successfully.
Oct 10 05:51:33 np0005479823 systemd[1]: session-42.scope: Consumed 23.061s CPU time.
Oct 10 05:51:33 np0005479823 systemd-logind[796]: Session 42 logged out. Waiting for processes to exit.
Oct 10 05:51:33 np0005479823 systemd-logind[796]: Removed session 42.
Oct 10 05:51:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:33 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:33 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:33 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:34 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:34 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:34 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:34 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:34.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:34.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:34 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:51:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:35 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:35 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:35 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1340016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:36 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:36 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:36 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:36 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:36.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:51:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:36.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:51:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:37 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:37 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:37 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:38 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1340016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:38 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:38 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:38 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:38.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:38.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:38 np0005479823 systemd-logind[796]: New session 43 of user zuul.
Oct 10 05:51:38 np0005479823 systemd[1]: Started Session 43 of User zuul.
Oct 10 05:51:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:39 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:39 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:39 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:39 np0005479823 python3.9[103140]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:51:39 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:51:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:40 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:40 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1340016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:40 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:40 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:51:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:40.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:51:40 np0005479823 python3.9[103293]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:51:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:40.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095141 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 05:51:41 np0005479823 python3.9[103372]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:51:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:41 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:41 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:41 np0005479823 systemd[1]: session-43.scope: Deactivated successfully.
Oct 10 05:51:41 np0005479823 systemd[1]: session-43.scope: Consumed 1.628s CPU time.
Oct 10 05:51:41 np0005479823 systemd-logind[796]: Session 43 logged out. Waiting for processes to exit.
Oct 10 05:51:41 np0005479823 systemd-logind[796]: Removed session 43.
Oct 10 05:51:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:41 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:42 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:42 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:42 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:42 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:51:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:42.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:51:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:42.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:43 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:43 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:43 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:44 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:44 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:44 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:44 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:44.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:44.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:44 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:51:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:45 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:45 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:45 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:46 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:46 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:46 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:46 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:46.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:46 np0005479823 systemd-logind[796]: New session 44 of user zuul.
Oct 10 05:51:46 np0005479823 systemd[1]: Started Session 44 of User zuul.
Oct 10 05:51:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:46.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:47 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:47 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:47 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:47 np0005479823 python3.9[103556]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:51:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:48 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:48 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:48.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:48.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:49 np0005479823 python3.9[103714]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:51:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:49 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:49 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:49 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:49 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:51:50 np0005479823 python3.9[103889]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:51:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:50 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:50 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 05:51:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:50 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:50 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:50 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:50.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:50 np0005479823 python3.9[103967]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.s8f14h_c recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:51:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:50.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:51 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:51 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:51 np0005479823 python3.9[104146]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:51:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:51 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:52 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:52 np0005479823 python3.9[104224]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.plz3066q recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:51:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:52 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:52 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:52 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:51:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:52.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:51:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:52.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:53 np0005479823 python3.9[104378]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:51:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:53 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 05:51:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:53 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 05:51:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:53 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:53 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:53 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:53 np0005479823 python3.9[104530]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:51:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:54 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:54 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:54 np0005479823 python3.9[104608]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:51:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:54 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:54 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:51:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:54.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:51:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:54 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:51:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:51:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:54.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:51:55 np0005479823 python3.9[104762]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:51:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:55 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:55 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:55 np0005479823 python3.9[104840]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:51:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:55 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:56 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:56 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 05:51:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:56 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:56 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:56 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:51:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:56.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:51:56 np0005479823 python3.9[104992]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:51:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:51:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:56.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:51:57 np0005479823 python3.9[105146]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:51:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:57 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:57 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:57 np0005479823 python3.9[105224]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:51:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:57 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:58 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:58 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:58 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:58 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:51:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:51:58.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:51:58 np0005479823 python3.9[105376]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:51:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:51:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:51:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:51:58.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:51:59 np0005479823 python3.9[105456]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:51:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:51:59 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:51:59 2025: (VI_0) received an invalid passwd!
Oct 10 05:51:59 np0005479823 systemd[82041]: Created slice User Background Tasks Slice.
Oct 10 05:51:59 np0005479823 systemd[82041]: Starting Cleanup of User's Temporary Files and Directories...
Oct 10 05:51:59 np0005479823 systemd[82041]: Finished Cleanup of User's Temporary Files and Directories.
Oct 10 05:51:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:51:59 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:51:59 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:52:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:00 np0005479823 python3.9[105660]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:52:00 np0005479823 systemd[1]: Reloading.
Oct 10 05:52:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:00 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:00 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:00.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:00 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:52:00 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:52:00 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 05:52:00 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:52:00 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:52:00 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 05:52:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:00.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:01 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:01 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:01 np0005479823 python3.9[105882]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:52:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:01 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:01 np0005479823 python3.9[105960]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:52:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:02 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:02 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138001ba0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:02 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:02 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:52:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:02.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:52:02 np0005479823 python3.9[106114]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:52:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:52:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:02.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:52:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095203 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 05:52:03 np0005479823 python3.9[106193]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:52:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:03 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:03 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:03 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:04 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:04 np0005479823 python3.9[106346]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:52:04 np0005479823 systemd[1]: Reloading.
Oct 10 05:52:04 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:52:04 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:52:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:04 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:04 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:04 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:04 np0005479823 systemd[1]: Starting Create netns directory...
Oct 10 05:52:04 np0005479823 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 10 05:52:04 np0005479823 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 10 05:52:04 np0005479823 systemd[1]: Finished Create netns directory.
Oct 10 05:52:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:04.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:04 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:52:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:04.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:05 np0005479823 python3.9[106539]: ansible-ansible.builtin.service_facts Invoked
Oct 10 05:52:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:05 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:05 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:05 np0005479823 network[106556]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 10 05:52:05 np0005479823 network[106557]: 'network-scripts' will be removed from distribution in near future.
Oct 10 05:52:05 np0005479823 network[106558]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 10 05:52:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:05 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138001ba0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:06 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:06 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:06 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:06 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:52:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:06.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:52:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:06.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:07 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:52:07 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:52:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:07 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:07 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:07 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:08 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138001ba0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:08 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:08 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:08 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:08.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:08.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:09 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:09 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:09 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:09 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:52:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:10 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1640022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:10 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:10 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:10 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:10.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:10 np0005479823 python3.9[106852]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:52:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:52:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:10.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:52:11 np0005479823 python3.9[106932]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:52:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:11 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:11 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:11 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:12 np0005479823 python3.9[107109]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:52:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:12 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:12 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1640022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:12 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:12 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:12.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:12 np0005479823 python3.9[107262]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:52:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:12.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:13 np0005479823 python3.9[107341]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:52:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:13 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:13 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:13 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:14 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:14 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003c70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:14 np0005479823 python3.9[107493]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct 10 05:52:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:14 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:14 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:14 np0005479823 systemd[1]: Starting Time & Date Service...
Oct 10 05:52:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:14.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:14 np0005479823 systemd[1]: Started Time & Date Service.
Oct 10 05:52:14 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:52:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:14.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:15 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:15 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:15 np0005479823 python3.9[107651]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:52:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:15 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1640022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:16 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:16 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:16 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:16 np0005479823 python3.9[107803]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:52:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:16 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:16.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:16 np0005479823 python3.9[107882]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:52:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:16.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:17 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:17 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:17 np0005479823 python3.9[108035]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:52:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:17 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003e10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:18 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:18 np0005479823 python3.9[108113]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.e7dgzvna recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:52:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:18 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:18 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:18 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:18.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:18.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:19 np0005479823 python3.9[108267]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:52:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:19 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:19 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:19 np0005479823 python3.9[108345]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:52:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:19 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:19 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:52:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:20 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003e10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:20 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:20 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:20 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:20.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:20 np0005479823 python3.9[108497]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:52:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:52:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:20.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:52:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:21 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:21 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:21 np0005479823 python3[108652]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 10 05:52:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:21 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:22 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:22 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003e10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:22 np0005479823 python3.9[108804]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:52:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:22 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:22 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:52:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:22.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:52:22 np0005479823 python3.9[108883]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:52:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:52:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:22.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:52:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:23 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:23 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:23 np0005479823 python3.9[109036]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:52:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:23 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:24 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:24 np0005479823 python3.9[109114]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:52:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:24 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:24 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:24 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:24.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:24 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:52:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:24.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:25 np0005479823 python3.9[109269]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:52:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:25 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:25 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:25 np0005479823 python3.9[109347]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:52:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:25 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168002010 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:26 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:26 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:26 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:26 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:26.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:26 np0005479823 python3.9[109499]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:52:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:52:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:26.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:52:26 np0005479823 python3.9[109579]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:52:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:27 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:27 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:27 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003e10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:28 np0005479823 python3.9[109731]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:52:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:28 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003e10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:28 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002ae0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:28 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:28 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:28 np0005479823 python3.9[109809]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:52:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:28.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:28.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:29 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:29 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:29 np0005479823 python3.9[109963]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:52:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:29 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:29 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:52:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:30 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:30 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003e10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:30 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:30 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:30.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:30 np0005479823 python3.9[110118]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:52:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:52:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:30.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:52:31 np0005479823 python3.9[110272]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:52:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:31 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:31 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:31 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003e10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:31 np0005479823 python3.9[110449]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:52:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:32 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:32 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002c80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:32 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:32 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:32.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:32.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:33 np0005479823 python3.9[110603]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct 10 05:52:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:33 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:33 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:33 np0005479823 python3.9[110755]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct 10 05:52:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:33 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1680011e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:34 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:34 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:34 np0005479823 systemd[1]: session-44.scope: Deactivated successfully.
Oct 10 05:52:34 np0005479823 systemd[1]: session-44.scope: Consumed 29.547s CPU time.
Oct 10 05:52:34 np0005479823 systemd-logind[796]: Session 44 logged out. Waiting for processes to exit.
Oct 10 05:52:34 np0005479823 systemd-logind[796]: Removed session 44.
Oct 10 05:52:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:34 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:34 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:34.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:34 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:52:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:52:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:34.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:52:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:35 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:35 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:35 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002c80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:36 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1680011e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:36 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:36 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:36 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:36.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:36.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:37 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:37 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:37 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:38 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002c80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:38 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168001380 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:38 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:38 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:38.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:38.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:39 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:39 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:39 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:39 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:52:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:40 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:40 np0005479823 systemd-logind[796]: New session 45 of user zuul.
Oct 10 05:52:40 np0005479823 systemd[1]: Started Session 45 of User zuul.
Oct 10 05:52:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:40 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002c80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:40 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:40 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:52:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:40.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:52:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:40.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:41 np0005479823 python3.9[110943]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Oct 10 05:52:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:41 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:41 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:41 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1680095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:41 np0005479823 python3.9[111095]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:52:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:42 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:42 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:42 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:42 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:42.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:42 np0005479823 python3.9[111250]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Oct 10 05:52:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:52:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:42.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:52:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:43 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:43 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:43 np0005479823 python3.9[111403]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.6f632o26 follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:52:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:43 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002c80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:44 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168009720 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:44 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:44 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:44 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:44 np0005479823 python3.9[111528]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.6f632o26 mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760089963.1590438-105-234758570087073/.source.6f632o26 _original_basename=.uep4gh6k follow=False checksum=2d908d3ce99ab235b2c2751c9a38992c3c685672 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:52:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:44.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:44 np0005479823 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 10 05:52:44 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:52:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:44.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:45 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:45 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:45 np0005479823 python3.9[111684]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:52:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:45 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:46 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002c80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:46 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff16800a040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:46 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:46 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:46.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:46 np0005479823 python3.9[111837]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCs576V3VvbSgv48Ml4JM3ripPY5VUVh8vdkDr1njjfd7J/WrQQkTf/D0b7+eGTXj3Y1fx1/haVrDafo7g0NqcSZX+zNUgTCnYPWafo7RMG4Q7ITVk1NPIkAC1cDUxHNeWhXaOkxCz96sTkO4aNW3uoFjsp2JkJtRJmHzT7q/bc0N9x7YcWh9vwRRBiOKlV8cWMHuHUzOlloEQLN67Dht1xHWr1eO/SITqUlWY13tc/54xQuo8nBQNNX9ArhMbJz2a9AoNVUAAYFF8hWFI5ES/GL9qsCp8dnmAtrY4Rc07QmHo1RkcjXe1f6D+vymRIP3YOqIjlWp0blCTfcCGno5lBa9f5JachIsogk+5+GYx4AAbWLyxxecfKzdCxrGnQlfFgldc1xDN1RG+8HwFEAuHQDWTCDUgF67FXSHy7aVxrdzU4046193/o3VKTpSaJmFldASxFgyUeujs56OgC0qYM0zKV4jOsMBcocVHvH/1FOPWIr81XXYvu6C/Ntd6sBj0=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGSf7pFS/S1SmUMk/yMobwR+LTaQZlAhBqo7Ido5r8dg#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBB1l0EOuMseZ7ulHkfzzVtKv+5A9EWRy+oXVB+t370vohhJoN3+lviS8xoR8GttJUcHVCaeioniRtOWysbNdC0I=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUnwO+j5aInA4FKMx5pWF8B0Zp6L17GsYV5RBbu6iT67LtXjwbz5nP4EC7t80boMHnS7DRNCAxF0FNMVhQ9o4+1E1n2mrUxxAw8YxcZTabu/lAqRb4I6RzmXdXSA9mF8O3onswi/KhJg6YUTFEWCuxWrMLco15IatKi+hNqcRUk1DreR2L/YN0W5qXkvj1z3aoph1h3Yn1lRjuQDrVHp6lCywixC2pHwYG+CrPyX+0PkXJg+JRvRdxNCIw0D0zOkJrnppmT8XpIj42JLRUGGV592XFVXHiEhZdOI2bdzPy490EfIbWF9Symqi/V5vf8SK9LMOscHXkD7jsT6VKzsUXyk6/IzzZ2TzhD173lt8HpRJyaZq4ME0ZSVYNyD58DN/CQ3xpO1c1E8Wp4fUswc4WHmb/eILnY0lDXOZt6Hb/e+K6RHu5e5GOo0KSfei/LyrqJkBQn2P8UkbJvrUh2bNw+whjvT5CmXd3rPCw+Xq3/K3Gpit1K/4pC0zGC+CQr7E=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILklS4uW4IrGY5dWZTg4VeKVeFB3jPeUpu/8f4D1+rd5#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 
AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCelD2lLiMWT09YjxTI9IfdSnHfdMuHKAAEYFKZmJg34mgwUIDqUQqoc9I6a7Ps9pRizY+UpHWL//lD7hvvhD5k=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDarlOcgDXqRdSww3oIuqu7nGIBJToNGSnU1ljOr6GTlHTxxOoTztIrvZrPaJA8w/ixztkhFZZSdRPw4meYayY05CNu9SneiL62twzDLDsqeDPAspkh69Ljj5aGCLf6GJDiK0m2h1jLDIFtXH3lIQE9781zA7ZQ8+/xeF4yRS1/Fb5CXDG+oi/J0veCffs6t0TYmrUfSgS2H2y0UxNu7C6GoQKRde1arPLOYexvlg2RjlWM6Ex4JCqTAd9EN330Kh4HUr3r46ET8mwi1mPndibbiW0heXgrg8FeV5hBqOxQsGgLEKpX1cNAz6Rr0C5Hg1xfGcsJtep88vbJFmMyV1jNowDtJCYpprqa16Nj35HBuuz7zbzVlIdeQhEJ9I4I7eNhUxlb2/XYRXy2hfsrM9D2TP7B+bVPLjlqgqy8stBhGBCtH32ppNsXHE6uGPHMovcz2VhbP/P3sp9NQV+hF2Q0RbBXrQZkEI9YJdhxQw5hyOqwfPrEEBFy8FpzSKfBAW0=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIC1nQuW/lbxVJxo9H20J7i0+Z6cHtufrF4VbA6zs724f#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBB0oTxSrAqx34tAubl7rouYPI7qhs6NhoDmGr3PTW1+mypEQw0EO+pZ99zSRnweC5RBoL080AgUKo7KN+v3LDHw=#012 create=True mode=0644 path=/tmp/ansible.6f632o26 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:52:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:47.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:47 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:47 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:47 np0005479823 python3.9[111990]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.6f632o26' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:52:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:47 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002c80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:48 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:48 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:48 np0005479823 python3.9[112144]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.6f632o26 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:52:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:48.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:49.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:49 np0005479823 systemd[1]: session-45.scope: Deactivated successfully.
Oct 10 05:52:49 np0005479823 systemd[1]: session-45.scope: Consumed 5.251s CPU time.
Oct 10 05:52:49 np0005479823 systemd-logind[796]: Session 45 logged out. Waiting for processes to exit.
Oct 10 05:52:49 np0005479823 systemd-logind[796]: Removed session 45.
Oct 10 05:52:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:49 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:49 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:49 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff16800a040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:49 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:52:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:50 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:50 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:50 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:50 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:50.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:52:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:51.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:52:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:51 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:51 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:51 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002c80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:52 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff16800a040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:52 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003fd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:52 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:52 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:52.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:53.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:53 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:53 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:53 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:54 np0005479823 systemd-logind[796]: New session 46 of user zuul.
Oct 10 05:52:54 np0005479823 systemd[1]: Started Session 46 of User zuul.
Oct 10 05:52:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:54 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002c80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:54 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff16800a040 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:54 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:54 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:54.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:54 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:52:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:55.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:55 np0005479823 python3.9[112356]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:52:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:55 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:55 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:55 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:56 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:56 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164003150 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:56 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:56 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:56 np0005479823 python3.9[112513]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct 10 05:52:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:56.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:57.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:57 np0005479823 python3.9[112669]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 05:52:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:57 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:57 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:57 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002c80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:58 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140004010 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:58 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:58 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:58 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:58 np0005479823 python3.9[112822]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.567895) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089978567935, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 2160, "num_deletes": 251, "total_data_size": 6238763, "memory_usage": 6311288, "flush_reason": "Manual Compaction"}
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089978584533, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 2509325, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10754, "largest_seqno": 12909, "table_properties": {"data_size": 2503062, "index_size": 3206, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 15654, "raw_average_key_size": 20, "raw_value_size": 2489405, "raw_average_value_size": 3195, "num_data_blocks": 143, "num_entries": 779, "num_filter_entries": 779, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089776, "oldest_key_time": 1760089776, "file_creation_time": 1760089978, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 16675 microseconds, and 7914 cpu microseconds.
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.584573) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 2509325 bytes OK
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.584591) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.586159) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.586173) EVENT_LOG_v1 {"time_micros": 1760089978586169, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.586190) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 6229284, prev total WAL file size 6229284, number of live WAL files 2.
Oct 10 05:52:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:52:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:52:58.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.588334) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323532' seq:0, type:0; will stop at (end)
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(2450KB)], [21(12MB)]
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089978588373, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 15935161, "oldest_snapshot_seqno": -1}
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 4433 keys, 14286695 bytes, temperature: kUnknown
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089978690025, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 14286695, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14252792, "index_size": 21697, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11141, "raw_key_size": 111751, "raw_average_key_size": 25, "raw_value_size": 14167774, "raw_average_value_size": 3195, "num_data_blocks": 932, "num_entries": 4433, "num_filter_entries": 4433, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760089978, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.690227) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 14286695 bytes
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.691683) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 156.7 rd, 140.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 12.8 +0.0 blob) out(13.6 +0.0 blob), read-write-amplify(12.0) write-amplify(5.7) OK, records in: 4856, records dropped: 423 output_compression: NoCompression
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.691701) EVENT_LOG_v1 {"time_micros": 1760089978691692, "job": 10, "event": "compaction_finished", "compaction_time_micros": 101710, "compaction_time_cpu_micros": 52067, "output_level": 6, "num_output_files": 1, "total_output_size": 14286695, "num_input_records": 4856, "num_output_records": 4433, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089978692158, "job": 10, "event": "table_file_deletion", "file_number": 23}
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089978694190, "job": 10, "event": "table_file_deletion", "file_number": 21}
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.588251) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.694298) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.694308) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.694312) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.694316) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 05:52:58 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:52:58.694320) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 05:52:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:52:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:52:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:52:59.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:52:59 np0005479823 python3.9[112977]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:52:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:52:59 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:52:59 2025: (VI_0) received an invalid passwd!
Oct 10 05:52:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:52:59 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138002c80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:52:59 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:53:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140004030 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:00 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:00 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:00 np0005479823 python3.9[113129]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:53:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:00.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:00 np0005479823 systemd[1]: session-46.scope: Deactivated successfully.
Oct 10 05:53:00 np0005479823 systemd[1]: session-46.scope: Consumed 3.922s CPU time.
Oct 10 05:53:00 np0005479823 systemd-logind[796]: Session 46 logged out. Waiting for processes to exit.
Oct 10 05:53:00 np0005479823 systemd-logind[796]: Removed session 46.
Oct 10 05:53:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:53:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:01.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:53:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:01 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:01 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:01 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:02 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:02 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144002470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:02 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:02 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:02.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:03.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:03 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:03 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:03 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:04 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:04 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:04 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:04 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:53:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:04.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:53:04 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:53:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:05.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:05 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:05 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:05 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144002470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:06 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:06 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:06 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:06 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:06.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:06 np0005479823 systemd-logind[796]: New session 47 of user zuul.
Oct 10 05:53:06 np0005479823 systemd[1]: Started Session 47 of User zuul.
Oct 10 05:53:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:07.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095307 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 05:53:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:07 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:07 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:07 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 05:53:07 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:53:07 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:53:07 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 05:53:07 np0005479823 python3.9[113385]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:53:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:07 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:08 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144002470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:08 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:08 np0005479823 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 05:53:08 np0005479823 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 05:53:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:08 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:08 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:08.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:08 np0005479823 python3.9[113555]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 05:53:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:09.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:09 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:09 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:09 np0005479823 python3.9[113640]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 10 05:53:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:09 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:09 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:53:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:10 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:10 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1440018c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:10 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:10 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:10.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:53:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:11.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:53:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:11 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:11 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:11 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:11 np0005479823 python3.9[113818]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:53:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:12 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:12 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:12 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:12 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:12.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:13.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:13 np0005479823 python3.9[113984]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 10 05:53:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:13 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:13 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:13 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1440018c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:13 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:53:13 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:53:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:14 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:14 np0005479823 python3.9[114146]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:53:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:14 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:14 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:14 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:53:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:14.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:53:14 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:53:14 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Oct 10 05:53:14 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:14.968147) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 05:53:14 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Oct 10 05:53:14 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089994968207, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 429, "num_deletes": 251, "total_data_size": 564009, "memory_usage": 572776, "flush_reason": "Manual Compaction"}
Oct 10 05:53:14 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Oct 10 05:53:14 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089994972445, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 372836, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12914, "largest_seqno": 13338, "table_properties": {"data_size": 370373, "index_size": 563, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 5935, "raw_average_key_size": 18, "raw_value_size": 365412, "raw_average_value_size": 1131, "num_data_blocks": 24, "num_entries": 323, "num_filter_entries": 323, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089979, "oldest_key_time": 1760089979, "file_creation_time": 1760089994, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Oct 10 05:53:14 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 4325 microseconds, and 2104 cpu microseconds.
Oct 10 05:53:14 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 05:53:14 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:14.972481) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 372836 bytes OK
Oct 10 05:53:14 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:14.972499) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Oct 10 05:53:14 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:14.973744) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Oct 10 05:53:14 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:14.973757) EVENT_LOG_v1 {"time_micros": 1760089994973753, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 05:53:14 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:14.973775) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 05:53:14 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 561283, prev total WAL file size 561283, number of live WAL files 2.
Oct 10 05:53:14 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 05:53:14 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:14.974352) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Oct 10 05:53:14 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 05:53:14 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(364KB)], [24(13MB)]
Oct 10 05:53:14 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089994974416, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 14659531, "oldest_snapshot_seqno": -1}
Oct 10 05:53:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:53:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:15.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:53:15 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4241 keys, 12701429 bytes, temperature: kUnknown
Oct 10 05:53:15 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089995042009, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 12701429, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12670525, "index_size": 19210, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10629, "raw_key_size": 108700, "raw_average_key_size": 25, "raw_value_size": 12590509, "raw_average_value_size": 2968, "num_data_blocks": 813, "num_entries": 4241, "num_filter_entries": 4241, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760089994, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Oct 10 05:53:15 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 05:53:15 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:15.042296) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 12701429 bytes
Oct 10 05:53:15 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:15.043523) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 216.6 rd, 187.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 13.6 +0.0 blob) out(12.1 +0.0 blob), read-write-amplify(73.4) write-amplify(34.1) OK, records in: 4756, records dropped: 515 output_compression: NoCompression
Oct 10 05:53:15 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:15.043549) EVENT_LOG_v1 {"time_micros": 1760089995043537, "job": 12, "event": "compaction_finished", "compaction_time_micros": 67668, "compaction_time_cpu_micros": 33043, "output_level": 6, "num_output_files": 1, "total_output_size": 12701429, "num_input_records": 4756, "num_output_records": 4241, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 05:53:15 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 05:53:15 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089995043752, "job": 12, "event": "table_file_deletion", "file_number": 26}
Oct 10 05:53:15 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 05:53:15 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760089995045987, "job": 12, "event": "table_file_deletion", "file_number": 24}
Oct 10 05:53:15 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:14.974213) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 05:53:15 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:15.046018) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 05:53:15 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:15.046022) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 05:53:15 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:15.046024) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 05:53:15 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:15.046026) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 05:53:15 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-09:53:15.046028) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 05:53:15 np0005479823 python3.9[114298]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:53:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:15 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:15 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:15 np0005479823 systemd[1]: session-47.scope: Deactivated successfully.
Oct 10 05:53:15 np0005479823 systemd[1]: session-47.scope: Consumed 6.018s CPU time.
Oct 10 05:53:15 np0005479823 systemd-logind[796]: Session 47 logged out. Waiting for processes to exit.
Oct 10 05:53:15 np0005479823 systemd-logind[796]: Removed session 47.
Oct 10 05:53:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:15 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 05:53:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:15 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:16 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1440018c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:16 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:16 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:16 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:16.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:17.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:17 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:17 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:17 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:18 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:18 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:18 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:18 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:53:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:18.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:53:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:18 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 05:53:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:18 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 05:53:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:18 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 05:53:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:53:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:19.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:53:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:19 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:19 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:19 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:19 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:53:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:20 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:20 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:20 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:20 np0005479823 systemd-logind[796]: New session 48 of user zuul.
Oct 10 05:53:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:20 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:20 np0005479823 systemd[1]: Started Session 48 of User zuul.
Oct 10 05:53:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:53:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:20.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:53:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:53:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:21.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:53:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:21 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:21 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:21 np0005479823 python3.9[114482]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:53:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:21 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 05:53:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:21 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:22 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:22 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:22 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:22 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:53:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:22.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:53:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:53:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:23.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:53:23 np0005479823 python3.9[114640]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:53:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:23 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:23 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:23 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:24 np0005479823 python3.9[114792]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:53:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:24 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:24 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:24 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:24 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:24.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:24 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:53:24 np0005479823 python3.9[114946]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:53:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:25.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:25 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:25 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:25 np0005479823 python3.9[115069]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090004.306494-155-188169185077042/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=82e52d2e0222fcf71d7bc250104afff621190352 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:53:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:25 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:26 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff144004230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:26 np0005479823 python3.9[115221]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:53:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:26 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:26 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:26 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:26.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:26 np0005479823 python3.9[115345]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090005.8433385-155-16603967517101/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=6d432417c0c3c485924638569c72973f4b3272fb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:53:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:27.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095327 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 05:53:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:27 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:27 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:27 np0005479823 python3.9[115498]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:53:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:27 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:28 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:28 np0005479823 python3.9[115622]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090006.977299-155-271065487793152/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=22dd871d21e0e7808e7ed0de3e38963760611c24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:53:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:28 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:28 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:28 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:28.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:29 np0005479823 python3.9[115776]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:53:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:29.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:29 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:29 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:29 np0005479823 python3.9[115928]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:53:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:29 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:29 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:53:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:30 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:30 np0005479823 python3.9[116080]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:53:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:30 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:30 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:30 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:30.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:30 np0005479823 python3.9[116204]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090009.7904294-321-128687080074166/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=dd9d6a73b1e231095db4a2bfe6482df0f3a33661 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:53:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:31.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:31 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:31 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:31 np0005479823 python3.9[116357]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:53:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:31 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:32 np0005479823 python3.9[116505]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090011.0296671-321-97806507666249/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=abcc61006dfeb8ab87ea24afb3b53290e7b990dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:53:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:32 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:32 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:32 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:32 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:32.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:32 np0005479823 python3.9[116658]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:53:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:33.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:33 np0005479823 python3.9[116782]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090012.2780507-321-276740763128526/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=56bf5fb5f6d0ebd1ad6e0802c492f9ea9fbe1bf5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:53:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:33 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:33 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:33 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:33 np0005479823 python3.9[116934]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:53:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:34 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff164001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:34 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:34 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:34 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:34 np0005479823 python3.9[117087]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:53:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:34.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:34 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:53:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000034s ======
Oct 10 05:53:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:35.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Oct 10 05:53:35 np0005479823 python3.9[117241]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:53:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:35 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:35 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:35 np0005479823 python3.9[117364]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090014.8390055-480-278969805050227/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=7b7b2ca77b92e88bec61aff3421984fcd2e9a026 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:53:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:35 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:36 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:36 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:36 np0005479823 python3.9[117516]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:53:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:36 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:36 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:36.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:36 np0005479823 python3.9[117642]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090015.9769478-480-279426433375276/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=abcc61006dfeb8ab87ea24afb3b53290e7b990dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:53:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:37.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:37 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:37 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:37 np0005479823 python3.9[117794]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:53:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:37 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:38 np0005479823 python3.9[117917]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090017.1071286-480-49894045491253/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=4b3eb023242fa3e834b8d259dc59353292772111 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:53:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:38 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:38 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:38 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:38 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:38.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:39.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:39 np0005479823 python3.9[118071]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:53:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:39 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:39 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:39 np0005479823 python3.9[118223]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:53:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:39 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:39 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:53:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:40 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:40 np0005479823 python3.9[118346]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090019.4338968-646-160770295232989/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=588de6fcfc4f8f2f1febb9ce163ed2886e4b0ed4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:53:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:40 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:40 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:40 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:40.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:53:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:41.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:53:41 np0005479823 python3.9[118500]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:53:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:41 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:41 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:41 np0005479823 python3.9[118652]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:53:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:41 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:42 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:42 np0005479823 python3.9[118775]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090021.3195982-719-21212220630772/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=588de6fcfc4f8f2f1febb9ce163ed2886e4b0ed4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:53:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:42 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138004610 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:42 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:42 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:42.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:42 np0005479823 python3.9[118928]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:53:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:43.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:43 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:43 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:43 np0005479823 python3.9[119081]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:53:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:43 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:44 np0005479823 python3.9[119204]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090023.1106236-797-178952081481129/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=588de6fcfc4f8f2f1febb9ce163ed2886e4b0ed4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:53:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:44 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:44 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:44 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:44 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:44.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:44 np0005479823 python3.9[119357]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:53:44 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:53:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:45.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:45 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:45 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:45 np0005479823 python3.9[119510]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:53:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:45 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138004630 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:46 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:46 np0005479823 python3.9[119633]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090025.1443326-862-52910869475216/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=588de6fcfc4f8f2f1febb9ce163ed2886e4b0ed4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:53:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:46 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:46 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:46 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:46.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:46 np0005479823 python3.9[119786]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:53:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000034s ======
Oct 10 05:53:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:47.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Oct 10 05:53:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:47 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:47 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:47 np0005479823 python3.9[119939]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:53:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:47 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:48 np0005479823 python3.9[120062]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090027.0635226-927-179103326546534/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=588de6fcfc4f8f2f1febb9ce163ed2886e4b0ed4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:53:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138004650 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:48 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:48 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:48 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:48.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:48 np0005479823 python3.9[120215]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:53:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:49.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:49 np0005479823 python3.9[120368]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:53:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:49 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:49 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:49 np0005479823 python3.9[120491]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090028.8873005-993-86912624521205/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=588de6fcfc4f8f2f1febb9ce163ed2886e4b0ed4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:53:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:49 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:49 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:53:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:50 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:50 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:50 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:50 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:50.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000034s ======
Oct 10 05:53:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:51.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Oct 10 05:53:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:51 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:51 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:51 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138004670 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:52 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:52 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:52 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:52 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:52.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:53.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:53 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:53 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:53 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:54 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff138004690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:54 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:54 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:54 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:54.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:54 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:53:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:55.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:55 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:55 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:55 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:56 np0005479823 systemd[1]: session-48.scope: Deactivated successfully.
Oct 10 05:53:56 np0005479823 systemd[1]: session-48.scope: Consumed 22.425s CPU time.
Oct 10 05:53:56 np0005479823 systemd-logind[796]: Session 48 logged out. Waiting for processes to exit.
Oct 10 05:53:56 np0005479823 systemd-logind[796]: Removed session 48.
Oct 10 05:53:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:56 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:56 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380046b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:56 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:56 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:56.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:53:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:57.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:53:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:57 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:57 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:57 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380046b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:58 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:58 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:53:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:58 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:58 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000034s ======
Oct 10 05:53:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:53:58.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Oct 10 05:53:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:53:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:53:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:53:59.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:53:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:53:59 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:53:59 2025: (VI_0) received an invalid passwd!
Oct 10 05:53:59 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:53:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:53:59 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380046b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:00 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:00 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:00 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:54:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:00.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:54:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:54:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:01.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:54:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:01 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:01 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:01 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:02 np0005479823 systemd-logind[796]: New session 49 of user zuul.
Oct 10 05:54:02 np0005479823 systemd[1]: Started Session 49 of User zuul.
Oct 10 05:54:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:02 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380046b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:02 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:02 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:02 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:54:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:02.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:54:03 np0005479823 python3.9[120710]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:54:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:03.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:03 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:03 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:03 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:03 np0005479823 python3.9[120862]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:54:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:04 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:04 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:04 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:04 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:54:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:04.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:54:04 np0005479823 python3.9[120986]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090043.2944539-64-172248528122302/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=f4f20d3bcbb08befb7837fd0e595f186c33a7cc2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:54:04 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:54:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:54:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:05.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:54:05 np0005479823 python3.9[121139]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:54:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:05 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:05 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:05 np0005479823 python3.9[121262]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090044.921748-64-83293049704282/.source.conf _original_basename=ceph.conf follow=False checksum=1a4b9adde8f120db415fb0ad56382b109e0fedc1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:54:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:05 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380046d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:06 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff140003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:06 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:06 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:06 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:54:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:06.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:54:06 np0005479823 systemd[1]: session-49.scope: Deactivated successfully.
Oct 10 05:54:06 np0005479823 systemd[1]: session-49.scope: Consumed 2.659s CPU time.
Oct 10 05:54:06 np0005479823 systemd-logind[796]: Session 49 logged out. Waiting for processes to exit.
Oct 10 05:54:06 np0005479823 systemd-logind[796]: Removed session 49.
Oct 10 05:54:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:54:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:07.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:54:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:07 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:07 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:07 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:08 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380046f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:08 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff150001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:08 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:08 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:08.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000034s ======
Oct 10 05:54:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:09.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Oct 10 05:54:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:09 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:09 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:09 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:54:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:09 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff134003240 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:10 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff168008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:10 np0005479823 kernel: ganesha.nfsd[105961]: segfault at 50 ip 00007ff21737e32e sp 00007ff1cf7fd210 error 4 in libntirpc.so.5.8[7ff217363000+2c000] likely on CPU 2 (core 0, socket 2)
Oct 10 05:54:10 np0005479823 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Oct 10 05:54:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[98297]: 10/10/2025 09:54:10 : epoch 68e8d6f8 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7ff1380046f0 fd 39 proxy ignored for local
Oct 10 05:54:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:10 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:10 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:10 np0005479823 systemd[1]: Started Process Core Dump (PID 121293/UID 0).
Oct 10 05:54:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:10.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:54:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:11.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:54:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:11 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:11 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:11 np0005479823 systemd-coredump[121294]: Process 98301 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 58:#012#0  0x00007ff21737e32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Oct 10 05:54:11 np0005479823 systemd[1]: systemd-coredump@2-121293-0.service: Deactivated successfully.
Oct 10 05:54:11 np0005479823 systemd[1]: systemd-coredump@2-121293-0.service: Consumed 1.140s CPU time.
Oct 10 05:54:11 np0005479823 podman[121301]: 2025-10-10 09:54:11.73984774 +0000 UTC m=+0.030648809 container died ae52613467d0e214ed6da8b5b7944d06483a73f40967b1a4665c8b125febf9ec (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Oct 10 05:54:11 np0005479823 systemd[1]: var-lib-containers-storage-overlay-93915b46d87e5adfc5a8e959d16f7d82e85ff82cf718b869d3a86bc987db93cb-merged.mount: Deactivated successfully.
Oct 10 05:54:11 np0005479823 podman[121301]: 2025-10-10 09:54:11.789780693 +0000 UTC m=+0.080581742 container remove ae52613467d0e214ed6da8b5b7944d06483a73f40967b1a4665c8b125febf9ec (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 05:54:11 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Main process exited, code=exited, status=139/n/a
Oct 10 05:54:11 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Failed with result 'exit-code'.
Oct 10 05:54:11 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.791s CPU time.
Oct 10 05:54:11 np0005479823 systemd-logind[796]: New session 50 of user zuul.
Oct 10 05:54:12 np0005479823 systemd[1]: Started Session 50 of User zuul.
Oct 10 05:54:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:12 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:12 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:12.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:13 np0005479823 python3.9[121523]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:54:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:54:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:13.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:54:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:13 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:13 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:13 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:54:13 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:54:13 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:54:13 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:54:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:14 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:14 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:14 np0005479823 python3.9[121759]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:54:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:14.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:14 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:54:14 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 05:54:14 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:54:14 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:54:14 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 05:54:15 np0005479823 python3.9[121913]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:54:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:15.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:15 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:15 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:16 np0005479823 python3.9[122063]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:54:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095416 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 05:54:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:16 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:16 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:16.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:54:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:17.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:54:17 np0005479823 python3.9[122217]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct 10 05:54:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:17 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:17 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:18 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:18 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:54:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:18.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:54:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:54:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:19.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:54:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:19 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:19 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:19 np0005479823 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Oct 10 05:54:19 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:54:19 np0005479823 python3.9[122376]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 05:54:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:20 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:20 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:20.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:20 np0005479823 python3.9[122486]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 05:54:20 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:54:20 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:54:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:54:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:21.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:54:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:21 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:21 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:21 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Scheduled restart job, restart counter is at 3.
Oct 10 05:54:21 np0005479823 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 05:54:21 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.791s CPU time.
Oct 10 05:54:21 np0005479823 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 05:54:22 np0005479823 podman[122559]: 2025-10-10 09:54:22.230206472 +0000 UTC m=+0.043005205 container create 42b56e31a57061dccff8e3670fdf444d91a3efcdd731ccdfa0e72b9ab7909387 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.vendor=CentOS)
Oct 10 05:54:22 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e64abc64c69623192b44a062c89724fdf3d77809147a47565255988d23e459a8/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 05:54:22 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e64abc64c69623192b44a062c89724fdf3d77809147a47565255988d23e459a8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 05:54:22 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e64abc64c69623192b44a062c89724fdf3d77809147a47565255988d23e459a8/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 05:54:22 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e64abc64c69623192b44a062c89724fdf3d77809147a47565255988d23e459a8/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 05:54:22 np0005479823 podman[122559]: 2025-10-10 09:54:22.28729108 +0000 UTC m=+0.100089843 container init 42b56e31a57061dccff8e3670fdf444d91a3efcdd731ccdfa0e72b9ab7909387 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Oct 10 05:54:22 np0005479823 podman[122559]: 2025-10-10 09:54:22.292532832 +0000 UTC m=+0.105331565 container start 42b56e31a57061dccff8e3670fdf444d91a3efcdd731ccdfa0e72b9ab7909387 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0)
Oct 10 05:54:22 np0005479823 bash[122559]: 42b56e31a57061dccff8e3670fdf444d91a3efcdd731ccdfa0e72b9ab7909387
Oct 10 05:54:22 np0005479823 podman[122559]: 2025-10-10 09:54:22.212836402 +0000 UTC m=+0.025635155 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:54:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 05:54:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 05:54:22 np0005479823 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 05:54:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 05:54:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 05:54:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 05:54:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 05:54:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 05:54:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 05:54:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:22 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:22 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:22.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:23.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:23 np0005479823 python3.9[122746]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 10 05:54:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:23 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:23 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:24 np0005479823 python3[122901]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Oct 10 05:54:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:24 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:24 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:24.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:24 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:54:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:25.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:25 np0005479823 python3.9[123055]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:54:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:25 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:25 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:26 np0005479823 python3.9[123207]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:54:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:26 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:26 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:54:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:26.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:54:26 np0005479823 python3.9[123286]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:54:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:54:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:27.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:54:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:27 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:27 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:27 np0005479823 python3.9[123439]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:54:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095427 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 05:54:28 np0005479823 python3.9[123517]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.tczzimlo recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:54:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:28 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 05:54:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:28 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 05:54:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:28 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 05:54:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:28 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:28 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:54:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:28.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:54:28 np0005479823 python3.9[123671]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:54:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:29.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:29 np0005479823 python3.9[123749]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:54:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:29 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:29 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:29 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:54:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:30 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:30 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:30.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:30 np0005479823 python3.9[123902]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:54:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:54:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:31.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:54:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:31 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:31 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:31 np0005479823 python3[124056]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 10 05:54:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:32 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:32 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:32 np0005479823 python3.9[124233]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:54:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:54:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:32.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:54:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:32 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 05:54:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:32 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 05:54:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:32 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 05:54:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:33 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 05:54:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:33.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:33 np0005479823 python3.9[124360]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090072.1083715-433-70134766167542/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:54:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:33 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:33 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:33 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 05:54:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:33 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 05:54:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:33 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 05:54:34 np0005479823 python3.9[124512]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:54:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:34 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:34 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:54:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:34.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:54:34 np0005479823 python3.9[124638]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090073.6032171-478-245081298478511/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:54:34 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:54:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:54:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:35.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:54:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:35 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:35 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:35 np0005479823 python3.9[124791]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:54:36 np0005479823 python3.9[124916]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090075.2725697-524-125248784544233/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:54:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:36 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:36 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:36.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:37.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:37 np0005479823 python3.9[125070]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:54:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:37 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:37 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:37 np0005479823 python3.9[125195]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090076.7601483-568-98735656420959/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:54:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:38 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:38 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:54:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:38.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:54:39 np0005479823 python3.9[125349]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:54:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:39.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:39 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:39 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 05:54:39 np0005479823 python3.9[125486]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090078.411788-613-161270723694536/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:54:39 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:54:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:39 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c0000df0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:40 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c0000df0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:40 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:40 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:40 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998000b60 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:40.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:40 np0005479823 python3.9[125642]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:54:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:41.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:41 np0005479823 python3.9[125795]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:54:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:41 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:41 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:41 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b4001c00 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:42 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998000b60 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:42 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:42 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095442 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 05:54:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:42 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c0002070 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:42 np0005479823 python3.9[125950]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:54:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:42 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 05:54:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:42 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 05:54:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:42.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:54:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:43.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:54:43 np0005479823 python3.9[126104]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:54:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:43 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:43 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:43 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c000ea0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:44 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b4001c00 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:44 np0005479823 python3.9[126257]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:54:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:44 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:44 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:44 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c0002070 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:44.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:44 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:54:45 np0005479823 python3.9[126413]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:54:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:45.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:45 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:45 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:45 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 05:54:45 np0005479823 python3.9[126568]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:54:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:45 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998001b40 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:46 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c0019c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:46 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:46 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:46 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b4001c00 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:46.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:47 np0005479823 python3.9[126720]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:54:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:54:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:47.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:54:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:47 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:47 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095447 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 05:54:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:47 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c0002070 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:48 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998001b40 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:48 np0005479823 python3.9[126873]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:c0:16:5a:16" external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:54:48 np0005479823 ovs-vsctl[126874]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:c0:16:5a:16 external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Oct 10 05:54:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:48 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:48 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:48 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c0019c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:48.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:49.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:49 np0005479823 python3.9[127028]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:54:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:49 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:49 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:49 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:54:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:49 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b4001c00 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:50 np0005479823 python3.9[127183]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:54:50 np0005479823 ovs-vsctl[127184]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Oct 10 05:54:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:50 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c0002070 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:50 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:50 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:50 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998002b10 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:50.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:51 np0005479823 python3.9[127336]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:54:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:51.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:51 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:51 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:51 np0005479823 python3.9[127490]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:54:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:51 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c0019c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:52 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b4001c00 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:52 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:52 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:52 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c0009990 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:52 np0005479823 python3.9[127667]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:54:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:52.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:53 np0005479823 python3.9[127747]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:54:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:54:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:53.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:54:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:53 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:53 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:53 np0005479823 python3.9[127899]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:54:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:53 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998002b10 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:54 np0005479823 python3.9[127977]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:54:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:54 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c002e50 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:54 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:54 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:54 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c002e50 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:54.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:54 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:54:55 np0005479823 python3.9[128131]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:54:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:54:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:55.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:54:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:55 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:55 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:55 np0005479823 python3.9[128283]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:54:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:55 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c0009990 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:56 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998002b10 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:56 np0005479823 python3.9[128361]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:54:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:56 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:56 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:56 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998002b10 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:56.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:57 np0005479823 python3.9[128515]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:54:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:57.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:57 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:57 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:57 np0005479823 python3.9[128593]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:54:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:57 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:58 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c0009990 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:58 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:58 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:58 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998002b10 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:54:58 np0005479823 python3.9[128745]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:54:58 np0005479823 systemd[1]: Reloading.
Oct 10 05:54:58 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:54:58 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:54:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:54:58.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:54:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:54:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:54:59.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:54:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:54:59 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:54:59 2025: (VI_0) received an invalid passwd!
Oct 10 05:54:59 np0005479823 python3.9[128936]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:54:59 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:54:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:54:59 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998002b10 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:00 np0005479823 python3.9[129014]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:55:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:00 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:00 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:00 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:00 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:00.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:01 np0005479823 python3.9[129168]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:55:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:01.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:01 np0005479823 python3.9[129246]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:55:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:01 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:01 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:01 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:02 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:02 np0005479823 python3.9[129398]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:55:02 np0005479823 systemd[1]: Reloading.
Oct 10 05:55:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:02 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:02 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:02 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:02 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:55:02 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:55:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:02.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:02 np0005479823 systemd[1]: Starting Create netns directory...
Oct 10 05:55:02 np0005479823 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 10 05:55:02 np0005479823 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 10 05:55:02 np0005479823 systemd[1]: Finished Create netns directory.
Oct 10 05:55:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:55:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:03.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:55:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:03 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:03 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:03 np0005479823 python3.9[129595]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:55:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:03 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:04 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c002e50 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:04 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:04 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:04 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c002e50 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:04 np0005479823 python3.9[129747]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:55:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:04.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:04 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:55:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:05.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:05 np0005479823 python3.9[129872]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090104.0617633-1366-121764447244985/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:55:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:05 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:05 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:06 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:06 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:06 np0005479823 python3.9[130024]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:55:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:06 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:06 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:06 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998003c10 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:06.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:55:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:07.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:55:07 np0005479823 python3.9[130178]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:55:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:07 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:07 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:07 np0005479823 python3.9[130301]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090106.755449-1441-17617002855457/.source.json _original_basename=.acs83w8s follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:55:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:08 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998003c10 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:08 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:08 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:08 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:08 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:55:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:08.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:55:08 np0005479823 python3.9[130454]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:55:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:55:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:09.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:55:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:09 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:09 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:09 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:55:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:10 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998003c10 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:10 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f50 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:10 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:10 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:10 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:55:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:10.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:55:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:55:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:11.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:55:11 np0005479823 python3.9[130884]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Oct 10 05:55:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:11 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:11 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:12 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:12 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:12 np0005479823 python3.9[131064]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 10 05:55:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:12 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:12 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:12 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a8000fa0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:12.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:55:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:13.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:55:13 np0005479823 python3.9[131218]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 10 05:55:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:13 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:13 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:14 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:14 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:14 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:14 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:14 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:55:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:14.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:55:14 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:55:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:15.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:15 np0005479823 python3[131397]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 10 05:55:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:15 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:15 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:16 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a8001aa0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:16 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:16 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:16 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:16 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:55:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:16.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:55:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:55:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:17.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:55:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:17 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:17 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:18 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:18 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a8001aa0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:18 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:18 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:18 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:55:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:18.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:55:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:55:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:19.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:55:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:19 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:19 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:19 np0005479823 podman[131410]: 2025-10-10 09:55:19.746151897 +0000 UTC m=+4.366179383 image pull 70c92fb64e1eda6ef063d34e60e9a541e44edbaa51e757e8304331202c76a3a7 quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857
Oct 10 05:55:19 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 05:55:19 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 2498 writes, 14K keys, 2498 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.06 MB/s#012Cumulative WAL: 2498 writes, 2498 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2498 writes, 14K keys, 2498 commit groups, 1.0 writes per commit group, ingest: 37.80 MB, 0.06 MB/s#012Interval WAL: 2498 writes, 2498 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    178.0      0.12              0.05         6    0.019       0      0       0.0       0.0#012  L6      1/0   12.11 MB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   2.9    176.1    154.4      0.39              0.18         5    0.078     21K   2261       0.0       0.0#012 Sum      1/0   12.11 MB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   3.9    135.8    159.8      0.51              0.24        11    0.046     21K   2261       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   3.9    136.3    160.4      0.50              0.24        10    0.050     21K   2261       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   0.0    176.1    154.4      0.39              0.18         5    0.078     21K   2261       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    180.8      0.11              0.05         5    0.023       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.020, interval 0.020#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.08 GB write, 0.13 MB/s write, 0.07 GB read, 0.11 MB/s read, 0.5 seconds#012Interval compaction: 0.08 GB write, 0.13 MB/s write, 0.07 GB read, 0.11 MB/s read, 0.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x56161a963350#2 capacity: 304.00 MB usage: 2.60 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 5.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(170,2.40 MB,0.789241%) FilterBlock(11,69.05 KB,0.0221805%) IndexBlock(11,132.45 KB,0.0425489%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Oct 10 05:55:19 np0005479823 podman[131529]: 2025-10-10 09:55:19.872145646 +0000 UTC m=+0.042888873 container create 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 10 05:55:19 np0005479823 podman[131529]: 2025-10-10 09:55:19.848861791 +0000 UTC m=+0.019605038 image pull 70c92fb64e1eda6ef063d34e60e9a541e44edbaa51e757e8304331202c76a3a7 quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857
Oct 10 05:55:19 np0005479823 python3[131397]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume 
/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857
Oct 10 05:55:19 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:55:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:20 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:20 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:20 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:20 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:20 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a80027b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:20.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:55:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:21.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:55:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:21 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:21 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:21 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:55:21 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:55:21 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 05:55:21 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:55:21 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:55:21 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 05:55:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:22 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:22 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:22.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:23 np0005479823 python3.9[131876]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:55:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:23.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:23 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:23 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:23 np0005479823 python3.9[132030]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:55:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:24 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:24 np0005479823 python3.9[132106]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:55:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:24 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:24 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:24 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:24 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:24.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:24 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:55:24 np0005479823 python3.9[132258]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760090124.3119717-1705-252320942057628/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:55:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:25.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:25 np0005479823 python3.9[132335]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 10 05:55:25 np0005479823 systemd[1]: Reloading.
Oct 10 05:55:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:25 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:25 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:25 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:55:25 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:55:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:26 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:26 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:26 np0005479823 python3.9[132446]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:55:26 np0005479823 systemd[1]: Reloading.
Oct 10 05:55:26 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:55:26 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:55:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:26 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:26 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:26 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb990000d00 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:26 np0005479823 systemd[1]: Starting ovn_controller container...
Oct 10 05:55:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:26.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:26 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:55:26 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a1891d963d03ff5546418e96f4e624e3abb79d61408efcf4dc0a9f2f55e7ddc/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct 10 05:55:26 np0005479823 systemd[1]: Started /usr/bin/podman healthcheck run 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3.
Oct 10 05:55:26 np0005479823 podman[132488]: 2025-10-10 09:55:26.809649777 +0000 UTC m=+0.133595064 container init 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 05:55:26 np0005479823 ovn_controller[132503]: + sudo -E kolla_set_configs
Oct 10 05:55:26 np0005479823 podman[132488]: 2025-10-10 09:55:26.835147472 +0000 UTC m=+0.159092719 container start 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 05:55:26 np0005479823 edpm-start-podman-container[132488]: ovn_controller
Oct 10 05:55:26 np0005479823 systemd[1]: Created slice User Slice of UID 0.
Oct 10 05:55:26 np0005479823 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct 10 05:55:26 np0005479823 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct 10 05:55:26 np0005479823 systemd[1]: Starting User Manager for UID 0...
Oct 10 05:55:26 np0005479823 edpm-start-podman-container[132487]: Creating additional drop-in dependency for "ovn_controller" (470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3)
Oct 10 05:55:26 np0005479823 podman[132534]: 2025-10-10 09:55:26.913855199 +0000 UTC m=+0.065538357 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 10 05:55:26 np0005479823 systemd[1]: 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3-766bfbd5ad31caf8.service: Main process exited, code=exited, status=1/FAILURE
Oct 10 05:55:26 np0005479823 systemd[1]: 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3-766bfbd5ad31caf8.service: Failed with result 'exit-code'.
Oct 10 05:55:26 np0005479823 systemd[1]: Reloading.
Oct 10 05:55:27 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:55:27 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:55:27 np0005479823 systemd[132566]: Queued start job for default target Main User Target.
Oct 10 05:55:27 np0005479823 systemd[132566]: Created slice User Application Slice.
Oct 10 05:55:27 np0005479823 systemd[132566]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct 10 05:55:27 np0005479823 systemd[132566]: Started Daily Cleanup of User's Temporary Directories.
Oct 10 05:55:27 np0005479823 systemd[132566]: Reached target Paths.
Oct 10 05:55:27 np0005479823 systemd[132566]: Reached target Timers.
Oct 10 05:55:27 np0005479823 systemd[132566]: Starting D-Bus User Message Bus Socket...
Oct 10 05:55:27 np0005479823 systemd[132566]: Starting Create User's Volatile Files and Directories...
Oct 10 05:55:27 np0005479823 systemd[132566]: Finished Create User's Volatile Files and Directories.
Oct 10 05:55:27 np0005479823 systemd[132566]: Listening on D-Bus User Message Bus Socket.
Oct 10 05:55:27 np0005479823 systemd[132566]: Reached target Sockets.
Oct 10 05:55:27 np0005479823 systemd[132566]: Reached target Basic System.
Oct 10 05:55:27 np0005479823 systemd[132566]: Reached target Main User Target.
Oct 10 05:55:27 np0005479823 systemd[132566]: Startup finished in 130ms.
Oct 10 05:55:27 np0005479823 systemd[1]: Started User Manager for UID 0.
Oct 10 05:55:27 np0005479823 systemd[1]: Started ovn_controller container.
Oct 10 05:55:27 np0005479823 systemd[1]: Started Session c1 of User root.
Oct 10 05:55:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:27.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: INFO:__main__:Validating config file
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: INFO:__main__:Writing out command to execute
Oct 10 05:55:27 np0005479823 systemd[1]: session-c1.scope: Deactivated successfully.
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: ++ cat /run_command
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: + ARGS=
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: + sudo kolla_copy_cacerts
Oct 10 05:55:27 np0005479823 systemd[1]: Started Session c2 of User root.
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: + [[ ! -n '' ]]
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: + . kolla_extend_start
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: + umask 0022
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Oct 10 05:55:27 np0005479823 systemd[1]: session-c2.scope: Deactivated successfully.
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Oct 10 05:55:27 np0005479823 NetworkManager[44866]: <info>  [1760090127.3396] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Oct 10 05:55:27 np0005479823 NetworkManager[44866]: <info>  [1760090127.3405] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 05:55:27 np0005479823 NetworkManager[44866]: <info>  [1760090127.3418] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Oct 10 05:55:27 np0005479823 NetworkManager[44866]: <info>  [1760090127.3423] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Oct 10 05:55:27 np0005479823 NetworkManager[44866]: <info>  [1760090127.3427] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct 10 05:55:27 np0005479823 kernel: br-int: entered promiscuous mode
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00014|main|INFO|OVS feature set changed, force recompute.
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00020|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00021|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00022|main|INFO|OVS feature set changed, force recompute.
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 10 05:55:27 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:27Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 10 05:55:27 np0005479823 NetworkManager[44866]: <info>  [1760090127.3558] manager: (ovn-38ab03-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Oct 10 05:55:27 np0005479823 kernel: genev_sys_6081: entered promiscuous mode
Oct 10 05:55:27 np0005479823 NetworkManager[44866]: <info>  [1760090127.3712] device (genev_sys_6081): carrier: link connected
Oct 10 05:55:27 np0005479823 NetworkManager[44866]: <info>  [1760090127.3714] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Oct 10 05:55:27 np0005479823 systemd-udevd[132664]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 05:55:27 np0005479823 systemd-udevd[132668]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 05:55:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:27 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:27 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:27 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:55:27 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:55:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:28 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a8002930 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:28 np0005479823 NetworkManager[44866]: <info>  [1760090128.0331] manager: (ovn-a1a60c-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Oct 10 05:55:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:28 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:28 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:28 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:28 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:28 np0005479823 NetworkManager[44866]: <info>  [1760090128.5642] manager: (ovn-ee0899-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Oct 10 05:55:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:55:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:28.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:55:29 np0005479823 python3.9[132800]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:55:29 np0005479823 ovs-vsctl[132801]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Oct 10 05:55:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:29.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:29 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:29 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:29 np0005479823 python3.9[132953]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:55:29 np0005479823 ovs-vsctl[132955]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Oct 10 05:55:29 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:55:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:30 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb990001820 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:30 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a8003250 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:30 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:30 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:30 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:30.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:31 np0005479823 python3.9[133110]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:55:31 np0005479823 ovs-vsctl[133111]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Oct 10 05:55:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:31.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:31 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:31 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:31 np0005479823 systemd[1]: session-50.scope: Deactivated successfully.
Oct 10 05:55:31 np0005479823 systemd[1]: session-50.scope: Consumed 54.620s CPU time.
Oct 10 05:55:31 np0005479823 systemd-logind[796]: Session 50 logged out. Waiting for processes to exit.
Oct 10 05:55:31 np0005479823 systemd-logind[796]: Removed session 50.
Oct 10 05:55:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:32 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:32 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb990001820 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:32 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:32 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:32 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb990001820 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:55:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:32.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:55:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:55:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:33.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:55:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:33 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:33 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:34 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:34 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:34 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:34 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:34 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a8003250 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:34.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:34 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:55:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Oct 10 05:55:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:35.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:35 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:35 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:36 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:36 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:36 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:36 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:36 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a8003250 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:36 np0005479823 systemd-logind[796]: New session 52 of user zuul.
Oct 10 05:55:36 np0005479823 systemd[1]: Started Session 52 of User zuul.
Oct 10 05:55:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:36.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:37.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:37 np0005479823 systemd[1]: Stopping User Manager for UID 0...
Oct 10 05:55:37 np0005479823 systemd[132566]: Activating special unit Exit the Session...
Oct 10 05:55:37 np0005479823 systemd[132566]: Stopped target Main User Target.
Oct 10 05:55:37 np0005479823 systemd[132566]: Stopped target Basic System.
Oct 10 05:55:37 np0005479823 systemd[132566]: Stopped target Paths.
Oct 10 05:55:37 np0005479823 systemd[132566]: Stopped target Sockets.
Oct 10 05:55:37 np0005479823 systemd[132566]: Stopped target Timers.
Oct 10 05:55:37 np0005479823 systemd[132566]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 10 05:55:37 np0005479823 systemd[132566]: Closed D-Bus User Message Bus Socket.
Oct 10 05:55:37 np0005479823 systemd[132566]: Stopped Create User's Volatile Files and Directories.
Oct 10 05:55:37 np0005479823 systemd[132566]: Removed slice User Application Slice.
Oct 10 05:55:37 np0005479823 systemd[132566]: Reached target Shutdown.
Oct 10 05:55:37 np0005479823 systemd[132566]: Finished Exit the Session.
Oct 10 05:55:37 np0005479823 systemd[132566]: Reached target Exit the Session.
Oct 10 05:55:37 np0005479823 systemd[1]: user@0.service: Deactivated successfully.
Oct 10 05:55:37 np0005479823 systemd[1]: Stopped User Manager for UID 0.
Oct 10 05:55:37 np0005479823 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct 10 05:55:37 np0005479823 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct 10 05:55:37 np0005479823 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct 10 05:55:37 np0005479823 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct 10 05:55:37 np0005479823 systemd[1]: Removed slice User Slice of UID 0.
Oct 10 05:55:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:37 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:37 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:37 np0005479823 python3.9[133322]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:55:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:38 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f50 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:38 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:38 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:38 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:38 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:55:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:38.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:55:39 np0005479823 python3.9[133480]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:55:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:39.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:39 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:39 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:39 np0005479823 python3.9[133632]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:55:39 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:55:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:40 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a8003250 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:40 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f50 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:40 np0005479823 python3.9[133784]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:55:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:40 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:40 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:40 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:40.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:41 np0005479823 python3.9[133938]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:55:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:41.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:41 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:41 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:41 np0005479823 python3.9[134090]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:55:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:42 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:42 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a8003250 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:42 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:42 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:42 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:55:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:42.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:55:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:43.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:43 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:43 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:43 np0005479823 python3.9[134242]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:55:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:44 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:44 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:44 np0005479823 python3.9[134395]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct 10 05:55:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:44 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:44 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:44 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb98c000b60 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:55:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:44.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:55:44 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:55:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:45.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:45 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:45 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:46 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:46 np0005479823 python3.9[134548]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:55:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:46 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:46 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:46 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:46 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:55:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:46.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:55:47 np0005479823 python3.9[134671]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090145.4117916-220-160631112383643/.source follow=False _original_basename=haproxy.j2 checksum=4bca74f6ee0b6450624d22997e2f90c414d58b44 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:55:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:47.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:47 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:47 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:47 np0005479823 python3.9[134821]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:55:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:48 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb98c0016a0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:48 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:48 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:48 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:48 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:48.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:48 np0005479823 python3.9[134942]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090147.3316293-265-124786216155015/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:55:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:49.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:49 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:49 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:49 np0005479823 python3.9[135096]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 05:55:49 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 05:55:49 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 600.0 total, 600.0 interval
Cumulative writes: 5546 writes, 24K keys, 5546 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
Cumulative WAL: 5546 writes, 880 syncs, 6.30 writes per sync, written: 0.02 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 5546 writes, 24K keys, 5546 commit groups, 1.0 writes per commit group, ingest: 18.97 MB, 0.03 MB/s
Interval WAL: 5546 writes, 880 syncs, 6.30 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.0 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.0 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.0 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_s
Oct 10 05:55:49 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:55:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:50 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:50 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb98c0016a0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:50 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:50 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:50 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:50 np0005479823 python3.9[135180]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 05:55:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:50.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:51.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:51 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:51 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:52 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:52 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:52 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:52 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:52 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb98c0016a0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:52.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:53 np0005479823 python3.9[135362]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 10 05:55:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:53.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:53 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:53 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:53 np0005479823 python3.9[135515]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:55:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:54 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:54 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:54 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:54 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:54 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:54 np0005479823 python3.9[135636]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090153.534199-376-74726853277112/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:55:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:54.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:54 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:55:55 np0005479823 python3.9[135788]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:55:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:55.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:55 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:55 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:55 np0005479823 python3.9[135909]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090154.6685617-376-124232097612305/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:55:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:56 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb98c002b10 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:56 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:56 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:56 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:56 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:56.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:57 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:57Z|00025|memory|INFO|16256 kB peak resident set size after 29.8 seconds
Oct 10 05:55:57 np0005479823 ovn_controller[132503]: 2025-10-10T09:55:57Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Oct 10 05:55:57 np0005479823 podman[136035]: 2025-10-10 09:55:57.174136243 +0000 UTC m=+0.087644527 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 10 05:55:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:55:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:57.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:55:57 np0005479823 python3.9[136070]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:55:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:57 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:57 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:57 np0005479823 python3.9[136208]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090156.8647535-509-52518455297501/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:55:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:58 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:58 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb98c002b10 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:58 np0005479823 python3.9[136358]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:55:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:58 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:58 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:55:58 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:55:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:55:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:55:58.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:55:58 np0005479823 python3.9[136480]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090157.8995488-509-235830547650296/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:55:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:55:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:55:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:55:59.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:55:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:55:59 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:55:59 2025: (VI_0) received an invalid passwd!
Oct 10 05:55:59 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:56:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:00 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:00 np0005479823 python3.9[136631]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:56:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:00 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:00 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:00 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:00 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:00.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:01 np0005479823 python3.9[136789]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:56:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:56:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:01.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:56:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:01 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:01 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:01 np0005479823 python3.9[136941]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:56:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095602 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 05:56:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:02 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:02 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998002070 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:02 np0005479823 python3.9[137019]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:56:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:02 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:02 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:02 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:56:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:02.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:56:03 np0005479823 python3.9[137173]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:56:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:03.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:03 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:03 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:03 np0005479823 python3.9[137251]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:56:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:04 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40034e0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:04 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:04 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:04 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:04 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998002070 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:04 np0005479823 python3.9[137403]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:56:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:04.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:04 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:56:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:05.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:05 np0005479823 python3.9[137557]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:56:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:05 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:05 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:05 np0005479823 python3.9[137635]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:56:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:06 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:06 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40041f0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:06 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:06 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:06 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:06 np0005479823 python3.9[137787]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:56:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:06.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:07 np0005479823 python3.9[137867]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:56:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:56:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:07.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:56:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:07 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:07 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:08 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998002070 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:08 np0005479823 python3.9[138019]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:56:08 np0005479823 systemd[1]: Reloading.
Oct 10 05:56:08 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:56:08 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:56:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:08 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:08 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:08 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:08 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40041f0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000065s ======
Oct 10 05:56:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:08.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Oct 10 05:56:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:09.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:09 np0005479823 python3.9[138211]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:56:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:09 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:09 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:09 np0005479823 python3.9[138289]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:56:09 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:56:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:10 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:10 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998002070 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:10 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:10 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:10 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 05:56:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:10 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a2b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:10 np0005479823 python3.9[138441]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:56:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:10.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:11 np0005479823 python3.9[138521]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:56:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:11.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:11 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:11 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:12 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40041f0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:12 np0005479823 python3.9[138673]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:56:12 np0005479823 systemd[1]: Reloading.
Oct 10 05:56:12 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:56:12 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:56:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:12 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:12 np0005479823 systemd[1]: Starting Create netns directory...
Oct 10 05:56:12 np0005479823 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 10 05:56:12 np0005479823 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 10 05:56:12 np0005479823 systemd[1]: Finished Create netns directory.
Oct 10 05:56:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:12 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:12 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:12 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb998002070 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:12.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:13.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:13 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:13 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:13 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 05:56:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:13 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 05:56:13 np0005479823 python3.9[138894]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:56:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:14 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a450 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:14 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40041f0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:14 np0005479823 python3.9[139046]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:56:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:14 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:14 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:14 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:14.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:14 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:56:14 np0005479823 python3.9[139170]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090173.8720157-961-170085316790550/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:56:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:15.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:15 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:15 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:16 np0005479823 python3.9[139323]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:56:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:16 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9980038c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:16 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a470 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:16 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:16 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:16 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40041f0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:16 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 05:56:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:56:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:16.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:56:16 np0005479823 python3.9[139476]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:56:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:17.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:17 np0005479823 python3.9[139600]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090176.413521-1036-128440861259022/.source.json _original_basename=._gvi0osk follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:56:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:17 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:17 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:18 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:18 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9980038c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:18 np0005479823 python3.9[139752]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:56:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:18 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:18 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:18 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a490 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:18.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:19.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:19 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:19 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:19 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:56:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:20 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40041f0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:20 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:20 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:20 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:20 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9980038c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:20.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:21 np0005479823 python3.9[140183]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Oct 10 05:56:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:21.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:21 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:21 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:22 np0005479823 python3.9[140335]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 10 05:56:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095622 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 05:56:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a4b0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40041f0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:22 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:22 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:22 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:56:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:22.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:56:23 np0005479823 python3.9[140489]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 10 05:56:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:23.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:23 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:23 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:24 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9980038c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:24 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a4d0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:24 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:24 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:24 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40041f0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:24.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:24 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:56:25 np0005479823 python3[140669]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 10 05:56:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:25.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:25 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:25 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:26 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:26 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9980038c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:26 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:26 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:26 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a4f0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:26.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:27.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:27 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:27 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:28 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40041f0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:28 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:28 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:28 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:28 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9980038c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:28.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:29.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:29 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:29 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:29 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:56:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:30 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a510 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:30 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b40041f0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:30 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:30 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:30 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:30.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:31.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:31 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:31 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:31 np0005479823 podman[140795]: 2025-10-10 09:56:31.630396947 +0000 UTC m=+3.897463011 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Oct 10 05:56:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:32 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9980038c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:32 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9c000a530 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:32 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:32 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:32 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a8001140 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:32.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:33.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:33 np0005479823 podman[140684]: 2025-10-10 09:56:33.468741492 +0000 UTC m=+8.268605852 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 10 05:56:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:33 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:33 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:33 np0005479823 podman[140944]: 2025-10-10 09:56:33.653292332 +0000 UTC m=+0.077290252 container create 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 05:56:33 np0005479823 podman[140944]: 2025-10-10 09:56:33.602401922 +0000 UTC m=+0.026399882 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 10 05:56:33 np0005479823 python3[140669]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 10 05:56:34 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 05:56:34 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:56:34 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:56:34 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 05:56:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:34 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a8001140 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:34 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:34 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:34 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:34 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb98c000f30 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:56:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:34.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:56:34 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:56:35 np0005479823 python3.9[141136]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:56:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:35.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:35 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:35 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:36 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9980038c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:36 np0005479823 python3.9[141290]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:56:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:36 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a8001140 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:36 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:36 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:36 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:36 np0005479823 python3.9[141366]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 05:56:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:56:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:36.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:56:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:37.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:37 np0005479823 python3.9[141519]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760090196.7515676-1300-247127145947468/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:56:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:37 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:37 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:38 np0005479823 python3.9[141595]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 10 05:56:38 np0005479823 systemd[1]: Reloading.
Oct 10 05:56:38 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:56:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:38 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb98c000f30 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:38 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:56:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:38 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9980038c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:38 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:38 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:38 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a80023c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:38.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:38 np0005479823 python3.9[141731]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:56:38 np0005479823 systemd[1]: Reloading.
Oct 10 05:56:39 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:56:39 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:56:39 np0005479823 systemd[1]: Starting ovn_metadata_agent container...
Oct 10 05:56:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:56:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:39.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:56:39 np0005479823 systemd[1]: Started libcrun container.
Oct 10 05:56:39 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2325f8d451b5d0b2b4e3183f8e0614a7d17eb52f78e5487ff9f04d9d9849509f/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Oct 10 05:56:39 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2325f8d451b5d0b2b4e3183f8e0614a7d17eb52f78e5487ff9f04d9d9849509f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 10 05:56:39 np0005479823 systemd[1]: Started /usr/bin/podman healthcheck run 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d.
Oct 10 05:56:39 np0005479823 podman[141774]: 2025-10-10 09:56:39.443797781 +0000 UTC m=+0.152845676 container init 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible)
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: + sudo -E kolla_set_configs
Oct 10 05:56:39 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:56:39 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:56:39 np0005479823 podman[141774]: 2025-10-10 09:56:39.478197052 +0000 UTC m=+0.187244917 container start 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct 10 05:56:39 np0005479823 edpm-start-podman-container[141774]: ovn_metadata_agent
Oct 10 05:56:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:39 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:39 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 10 05:56:39 np0005479823 podman[141797]: 2025-10-10 09:56:39.535779867 +0000 UTC m=+0.048701124 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: INFO:__main__:Validating config file
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: INFO:__main__:Copying service configuration files
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: INFO:__main__:Writing out command to execute
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: INFO:__main__:Setting permission for /var/lib/neutron
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: INFO:__main__:Setting permission for /var/lib/neutron/external
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Oct 10 05:56:39 np0005479823 edpm-start-podman-container[141773]: Creating additional drop-in dependency for "ovn_metadata_agent" (2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d)
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: ++ cat /run_command
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: + CMD=neutron-ovn-metadata-agent
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: + ARGS=
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: + sudo kolla_copy_cacerts
Oct 10 05:56:39 np0005479823 systemd[1]: Reloading.
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: + [[ ! -n '' ]]
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: + . kolla_extend_start
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: Running command: 'neutron-ovn-metadata-agent'
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: + umask 0022
Oct 10 05:56:39 np0005479823 ovn_metadata_agent[141790]: + exec neutron-ovn-metadata-agent
Oct 10 05:56:39 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:56:39 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:56:39 np0005479823 systemd[1]: Started ovn_metadata_agent container.
Oct 10 05:56:39 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:56:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:40 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:40 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb98c001f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:40 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:40 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:40 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9980038c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:40.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:41 np0005479823 systemd[1]: session-52.scope: Deactivated successfully.
Oct 10 05:56:41 np0005479823 systemd[1]: session-52.scope: Consumed 53.759s CPU time.
Oct 10 05:56:41 np0005479823 systemd-logind[796]: Session 52 logged out. Waiting for processes to exit.
Oct 10 05:56:41 np0005479823 systemd-logind[796]: Removed session 52.
Oct 10 05:56:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:41.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.389 141795 INFO neutron.common.config [-] Logging enabled!#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.390 141795 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.390 141795 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.390 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.390 141795 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.391 141795 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.391 141795 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.391 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.391 141795 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.391 141795 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.391 141795 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.392 141795 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.392 141795 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.392 141795 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.392 141795 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.392 141795 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.392 141795 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.393 141795 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.393 141795 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.393 141795 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.393 141795 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.393 141795 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.393 141795 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.393 141795 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.394 141795 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.394 141795 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.394 141795 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.394 141795 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.394 141795 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.394 141795 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.394 141795 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.394 141795 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.395 141795 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.395 141795 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.395 141795 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.395 141795 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.395 141795 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.395 141795 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.396 141795 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.396 141795 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.396 141795 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.396 141795 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.396 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.396 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.396 141795 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.396 141795 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.397 141795 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.397 141795 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.397 141795 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.397 141795 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.397 141795 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.397 141795 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.397 141795 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.397 141795 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.398 141795 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.398 141795 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.398 141795 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.398 141795 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.398 141795 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.399 141795 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.399 141795 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.399 141795 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.399 141795 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.399 141795 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.399 141795 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.399 141795 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.400 141795 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.400 141795 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.400 141795 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.400 141795 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.400 141795 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.400 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.400 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.401 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.401 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.401 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.401 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.401 141795 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.401 141795 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.401 141795 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.402 141795 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.402 141795 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.402 141795 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.402 141795 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.402 141795 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.402 141795 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.402 141795 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.403 141795 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.403 141795 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.403 141795 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.403 141795 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.403 141795 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.403 141795 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.403 141795 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.403 141795 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.404 141795 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.404 141795 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.404 141795 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.404 141795 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.404 141795 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.404 141795 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.404 141795 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.404 141795 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.405 141795 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.405 141795 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.405 141795 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.405 141795 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.405 141795 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.405 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.405 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.406 141795 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.406 141795 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.406 141795 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.406 141795 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.406 141795 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.406 141795 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.406 141795 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.407 141795 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.407 141795 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.407 141795 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.407 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.407 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.407 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.407 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.408 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.408 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.408 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.408 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.408 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.408 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.408 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.409 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.409 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.409 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.409 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.409 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.409 141795 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.409 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.410 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.410 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.410 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.410 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.410 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.410 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.410 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.411 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.411 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.411 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.411 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.411 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.411 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.411 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.412 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.412 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.412 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.412 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.412 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.412 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.412 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.413 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.413 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.413 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.413 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.413 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.413 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.413 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.414 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.414 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.414 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.414 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.414 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.414 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.414 141795 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.415 141795 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.415 141795 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.415 141795 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.415 141795 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.415 141795 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.415 141795 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.415 141795 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.416 141795 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.416 141795 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.416 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.416 141795 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.416 141795 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.416 141795 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.416 141795 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.417 141795 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.417 141795 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.417 141795 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.417 141795 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.417 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.417 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.417 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.418 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.418 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.418 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.418 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.418 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.418 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.418 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.419 141795 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.419 141795 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.419 141795 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.419 141795 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.419 141795 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.419 141795 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.419 141795 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.420 141795 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.420 141795 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.420 141795 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.420 141795 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.420 141795 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.420 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.420 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.421 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.421 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.421 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.421 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.421 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.421 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.421 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.422 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.422 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.422 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.422 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.422 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.422 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.422 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.423 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.423 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.423 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.423 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.423 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.423 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.423 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.424 141795 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.424 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.424 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.424 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.424 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.424 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.424 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.425 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.425 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.425 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.425 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.425 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.425 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.425 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.426 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.426 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.426 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.426 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.426 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.426 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.426 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.427 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.427 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.427 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.427 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.427 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.427 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.427 141795 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.428 141795 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.428 141795 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.428 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.428 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.428 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.428 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.428 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.429 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.429 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.429 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.429 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.429 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.429 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.429 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.430 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.430 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.430 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.430 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.430 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.430 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.431 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.431 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.431 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.431 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.431 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.431 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.431 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.432 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.432 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.432 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.432 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.432 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.432 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.433 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.433 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.433 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.433 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.433 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.433 141795 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.433 141795 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.442 141795 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.442 141795 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.442 141795 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.442 141795 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.443 141795 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.455 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 49146ebb-575d-4bd4-816c-0b242fb944ee (UUID: 49146ebb-575d-4bd4-816c-0b242fb944ee) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.478 141795 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.478 141795 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.478 141795 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.479 141795 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.484 141795 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.492 141795 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.498 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '49146ebb-575d-4bd4-816c-0b242fb944ee'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>], external_ids={}, name=49146ebb-575d-4bd4-816c-0b242fb944ee, nb_cfg_timestamp=1760090135358, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.500 141795 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fa4a23bdf70>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Oct 10 05:56:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:41 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:41 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.501 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.501 141795 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.501 141795 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.501 141795 INFO oslo_service.service [-] Starting 1 workers#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.506 141795 DEBUG oslo_service.service [-] Started child 141903 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.509 141903 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-361953'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.510 141795 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpsqkyazod/privsep.sock']#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.528 141903 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.529 141903 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.529 141903 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.532 141903 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.538 141903 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Oct 10 05:56:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:41.543 141903 INFO eventlet.wsgi.server [-] (141903) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Oct 10 05:56:42 np0005479823 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Oct 10 05:56:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:42 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9a80023c0 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:42 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:42.175 141795 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Oct 10 05:56:42 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:42.176 141795 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpsqkyazod/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Oct 10 05:56:42 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:42.046 141908 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct 10 05:56:42 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:42.052 141908 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct 10 05:56:42 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:42.055 141908 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Oct 10 05:56:42 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:42.055 141908 INFO oslo.privsep.daemon [-] privsep daemon running as pid 141908#033[00m
Oct 10 05:56:42 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:42.179 141908 DEBUG oslo.privsep.daemon [-] privsep: reply[01e9a910-34c8-48ab-845c-f8e4b2b45d8a]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 05:56:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:42 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb99c003f70 fd 41 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:56:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:42 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:42 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[122575]: 10/10/2025 09:56:42 : epoch 68e8d7ce : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb98c001f70 fd 41 proxy ignored for local
Oct 10 05:56:42 np0005479823 kernel: ganesha.nfsd[140891]: segfault at 50 ip 00007fba6c89632e sp 00007fba297f9210 error 4 in libntirpc.so.5.8[7fba6c87b000+2c000] likely on CPU 2 (core 0, socket 2)
Oct 10 05:56:42 np0005479823 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Oct 10 05:56:42 np0005479823 systemd[1]: Started Process Core Dump (PID 141914/UID 0).
Oct 10 05:56:42 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:42.679 141908 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 05:56:42 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:42.679 141908 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 05:56:42 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:42.680 141908 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 05:56:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:42.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.213 141908 DEBUG oslo.privsep.daemon [-] privsep: reply[f5026b2d-e834-4424-a4d3-e3d6c094a10a]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.215 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, column=external_ids, values=({'neutron:ovn-metadata-id': 'cc7418c8-610c-5a79-bc13-35d330f4cf3b'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.223 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.229 141795 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.229 141795 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.229 141795 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.229 141795 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.229 141795 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.229 141795 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.230 141795 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.230 141795 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.230 141795 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.230 141795 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.230 141795 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.230 141795 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.230 141795 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.230 141795 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.231 141795 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.231 141795 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.231 141795 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.231 141795 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.231 141795 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.231 141795 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.231 141795 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.231 141795 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.232 141795 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.232 141795 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.232 141795 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.232 141795 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.232 141795 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.233 141795 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.233 141795 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.233 141795 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.233 141795 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.233 141795 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.233 141795 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.233 141795 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.234 141795 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.234 141795 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.234 141795 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.234 141795 DEBUG oslo_service.service [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.234 141795 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.234 141795 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.234 141795 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.234 141795 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.235 141795 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.235 141795 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.235 141795 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.235 141795 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.235 141795 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.235 141795 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.235 141795 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.235 141795 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.235 141795 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.235 141795 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.235 141795 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.236 141795 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.236 141795 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.236 141795 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.236 141795 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.236 141795 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.236 141795 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.236 141795 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.236 141795 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.236 141795 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.236 141795 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.237 141795 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.237 141795 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.237 141795 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.237 141795 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.237 141795 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.237 141795 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.237 141795 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.237 141795 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.237 141795 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.238 141795 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.238 141795 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.238 141795 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.238 141795 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.238 141795 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.238 141795 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.238 141795 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.239 141795 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.239 141795 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.239 141795 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.239 141795 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.239 141795 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.239 141795 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.239 141795 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.239 141795 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.239 141795 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.240 141795 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.240 141795 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.240 141795 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.240 141795 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.240 141795 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.240 141795 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.240 141795 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.240 141795 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.240 141795 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.240 141795 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.241 141795 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.241 141795 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.241 141795 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.241 141795 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.241 141795 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.241 141795 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.241 141795 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.241 141795 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.241 141795 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.241 141795 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.242 141795 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.242 141795 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.242 141795 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.242 141795 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.242 141795 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.242 141795 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.242 141795 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.242 141795 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.242 141795 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.243 141795 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.243 141795 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.243 141795 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.243 141795 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.243 141795 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.243 141795 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.243 141795 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.243 141795 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.244 141795 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.244 141795 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.244 141795 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.244 141795 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.244 141795 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.244 141795 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.244 141795 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.244 141795 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.244 141795 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.245 141795 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.245 141795 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.245 141795 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.245 141795 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.245 141795 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.245 141795 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.245 141795 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.245 141795 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.245 141795 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.246 141795 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.246 141795 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.246 141795 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.246 141795 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.246 141795 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.246 141795 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.246 141795 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.246 141795 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.246 141795 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.246 141795 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.247 141795 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.247 141795 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.247 141795 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.247 141795 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.247 141795 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.247 141795 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.247 141795 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.247 141795 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.247 141795 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.248 141795 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.248 141795 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.248 141795 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.248 141795 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.248 141795 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.248 141795 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.248 141795 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.248 141795 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.248 141795 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.248 141795 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.249 141795 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.249 141795 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.249 141795 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.249 141795 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.249 141795 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.249 141795 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.249 141795 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.249 141795 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.249 141795 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.250 141795 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.250 141795 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.250 141795 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.250 141795 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.250 141795 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.250 141795 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.250 141795 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.250 141795 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.251 141795 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.251 141795 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.251 141795 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.251 141795 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.251 141795 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.251 141795 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.251 141795 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.251 141795 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.251 141795 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.252 141795 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.252 141795 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.252 141795 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.252 141795 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.252 141795 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.252 141795 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.252 141795 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.252 141795 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.253 141795 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.253 141795 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.253 141795 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.253 141795 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.253 141795 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.253 141795 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.253 141795 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.253 141795 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.253 141795 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.253 141795 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.254 141795 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.254 141795 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.254 141795 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.254 141795 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.254 141795 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.254 141795 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.254 141795 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.254 141795 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.254 141795 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.254 141795 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.255 141795 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.255 141795 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.255 141795 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.255 141795 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.255 141795 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.255 141795 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.255 141795 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.255 141795 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.255 141795 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.255 141795 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.255 141795 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.256 141795 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.256 141795 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.256 141795 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.256 141795 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.256 141795 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.256 141795 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.256 141795 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.256 141795 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.256 141795 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.257 141795 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.257 141795 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.257 141795 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.257 141795 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.257 141795 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.257 141795 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.257 141795 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.257 141795 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.257 141795 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.257 141795 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.258 141795 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.258 141795 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.258 141795 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.258 141795 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.258 141795 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.258 141795 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.258 141795 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.258 141795 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.258 141795 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.258 141795 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.259 141795 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.259 141795 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.259 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.259 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.259 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.259 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.259 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.259 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.259 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.260 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.260 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.260 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.260 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.260 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.260 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.260 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.260 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.260 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.260 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.261 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.261 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.261 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.261 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.261 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.261 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.261 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.261 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.261 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.262 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.262 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.262 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.262 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.262 141795 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.262 141795 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.262 141795 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.262 141795 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.262 141795 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 05:56:43 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:56:43.262 141795 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct 10 05:56:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:43.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:43 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:43 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:44 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:44 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:44.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:44 np0005479823 systemd-coredump[141915]: Process 122579 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 63:#012#0  0x00007fba6c89632e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Oct 10 05:56:44 np0005479823 systemd[1]: systemd-coredump@3-141914-0.service: Deactivated successfully.
Oct 10 05:56:44 np0005479823 systemd[1]: systemd-coredump@3-141914-0.service: Consumed 1.186s CPU time.
Oct 10 05:56:44 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:56:44 np0005479823 podman[141923]: 2025-10-10 09:56:44.961763172 +0000 UTC m=+0.025568251 container died 42b56e31a57061dccff8e3670fdf444d91a3efcdd731ccdfa0e72b9ab7909387 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 05:56:44 np0005479823 systemd[1]: var-lib-containers-storage-overlay-e64abc64c69623192b44a062c89724fdf3d77809147a47565255988d23e459a8-merged.mount: Deactivated successfully.
Oct 10 05:56:45 np0005479823 podman[141923]: 2025-10-10 09:56:45.004555239 +0000 UTC m=+0.068360308 container remove 42b56e31a57061dccff8e3670fdf444d91a3efcdd731ccdfa0e72b9ab7909387 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Oct 10 05:56:45 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Main process exited, code=exited, status=139/n/a
Oct 10 05:56:45 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Failed with result 'exit-code'.
Oct 10 05:56:45 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.690s CPU time.
Oct 10 05:56:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:56:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:45.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:56:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:45 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:45 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:46 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:46 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:46 np0005479823 systemd-logind[796]: New session 53 of user zuul.
Oct 10 05:56:46 np0005479823 systemd[1]: Started Session 53 of User zuul.
Oct 10 05:56:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:46.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:47.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:47 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:47 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:47 np0005479823 python3.9[142122]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 05:56:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:48 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:48 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:48.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:48 np0005479823 python3.9[142279]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:56:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:49.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:49 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:49 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:49 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:56:50 np0005479823 python3.9[142445]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 10 05:56:50 np0005479823 systemd[1]: Reloading.
Oct 10 05:56:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:50 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:50 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:50 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:56:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095650 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 05:56:50 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:56:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:56:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:50.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:56:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:56:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:51.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:56:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:51 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:51 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:51 np0005479823 python3.9[142632]: ansible-ansible.builtin.service_facts Invoked
Oct 10 05:56:51 np0005479823 network[142649]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 10 05:56:51 np0005479823 network[142650]: 'network-scripts' will be removed from distribution in near future.
Oct 10 05:56:51 np0005479823 network[142651]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 10 05:56:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:52 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:52 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:56:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:52.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:56:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:56:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:53.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:56:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:53 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:53 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:54 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:54 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:54.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:54 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:56:55 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Scheduled restart job, restart counter is at 4.
Oct 10 05:56:55 np0005479823 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 05:56:55 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.690s CPU time.
Oct 10 05:56:55 np0005479823 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 05:56:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:55.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:55 np0005479823 podman[142832]: 2025-10-10 09:56:55.390370723 +0000 UTC m=+0.047161976 container create b23f4245aa0739ff2aefd746e5c9116bceb56d25c8921ab2597aff854d15760d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 05:56:55 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1baeee591cf25b2c556109a3adde6faf1bce3eb6b30f02f9b94da05ddab8c8f6/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 05:56:55 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1baeee591cf25b2c556109a3adde6faf1bce3eb6b30f02f9b94da05ddab8c8f6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 05:56:55 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1baeee591cf25b2c556109a3adde6faf1bce3eb6b30f02f9b94da05ddab8c8f6/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 05:56:55 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1baeee591cf25b2c556109a3adde6faf1bce3eb6b30f02f9b94da05ddab8c8f6/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 05:56:55 np0005479823 podman[142832]: 2025-10-10 09:56:55.443795087 +0000 UTC m=+0.100586350 container init b23f4245aa0739ff2aefd746e5c9116bceb56d25c8921ab2597aff854d15760d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Oct 10 05:56:55 np0005479823 podman[142832]: 2025-10-10 09:56:55.450138988 +0000 UTC m=+0.106930241 container start b23f4245aa0739ff2aefd746e5c9116bceb56d25c8921ab2597aff854d15760d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Oct 10 05:56:55 np0005479823 bash[142832]: b23f4245aa0739ff2aefd746e5c9116bceb56d25c8921ab2597aff854d15760d
Oct 10 05:56:55 np0005479823 podman[142832]: 2025-10-10 09:56:55.369095329 +0000 UTC m=+0.025886582 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:56:55 np0005479823 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 05:56:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:56:55 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 05:56:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:56:55 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 05:56:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:55 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:55 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:56:55 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 05:56:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:56:55 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 05:56:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:56:55 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 05:56:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:56:55 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 05:56:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:56:55 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 05:56:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:56:55 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 05:56:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:56 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:56 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:56 np0005479823 python3.9[143048]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:56:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:56.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:57.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:57 np0005479823 python3.9[143203]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:56:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:57 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:57 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:58 np0005479823 python3.9[143356]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:56:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:58 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:58 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:56:58.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:56:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:56:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:56:59.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:56:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:56:59 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:56:59 2025: (VI_0) received an invalid passwd!
Oct 10 05:56:59 np0005479823 python3.9[143511]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:56:59 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:57:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:00 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:00 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:00 np0005479823 python3.9[143664]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:57:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:57:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:00.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:57:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:01.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:01 np0005479823 python3.9[143819]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:57:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:01 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:01 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:01 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 05:57:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:01 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 05:57:02 np0005479823 python3.9[143972]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 05:57:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:02 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:02 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:02 np0005479823 podman[144021]: 2025-10-10 09:57:02.821014861 +0000 UTC m=+0.096334916 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 10 05:57:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:57:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:02.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:57:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:57:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:03.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:57:03 np0005479823 python3.9[144156]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:57:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:03 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:03 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:03 np0005479823 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Oct 10 05:57:03 np0005479823 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Oct 10 05:57:03 np0005479823 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Oct 10 05:57:03 np0005479823 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Oct 10 05:57:03 np0005479823 python3.9[144308]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:57:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:04 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:04 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:04 np0005479823 python3.9[144460]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:57:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:04.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:04 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:57:05 np0005479823 python3.9[144614]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:57:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:05.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:05 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:05 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:05 np0005479823 python3.9[144766]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:57:06 np0005479823 python3.9[144918]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:57:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:06 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:06 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:57:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:06.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:57:07 np0005479823 python3.9[145072]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:57:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:07.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:07 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:07 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Oct 10 05:57:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:07 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 05:57:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:08 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce44000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:08 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce38001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:08 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:08 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:08 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce20000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:57:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:08.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:57:08 np0005479823 python3.9[145240]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:57:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:09.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:09 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:09 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:09 np0005479823 python3.9[145393]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:57:09 np0005479823 podman[145402]: 2025-10-10 09:57:09.797009615 +0000 UTC m=+0.063596827 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct 10 05:57:09 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:57:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:10 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce1c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:10 np0005479823 python3.9[145565]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:57:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:10 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce28000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:10 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:10 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095710 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 05:57:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:10 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce38001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:10.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:10 np0005479823 python3.9[145718]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:57:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:11.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:11 np0005479823 python3.9[145871]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:57:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:11 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:11 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:12 np0005479823 python3.9[146023]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:57:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:12 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce200016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:12 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce1c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:12 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:12 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:12 np0005479823 kernel: ganesha.nfsd[145102]: segfault at 50 ip 00007fcef39fe32e sp 00007fcec0ff8210 error 4 in libntirpc.so.5.8[7fcef39e3000+2c000] likely on CPU 1 (core 0, socket 1)
Oct 10 05:57:12 np0005479823 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Oct 10 05:57:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[142851]: 10/10/2025 09:57:12 : epoch 68e8d867 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fce28001ac0 fd 39 proxy ignored for local
Oct 10 05:57:12 np0005479823 systemd[1]: Started Process Core Dump (PID 146177/UID 0).
Oct 10 05:57:12 np0005479823 python3.9[146175]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:57:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:12.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:13.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:13 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:13 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:13 np0005479823 systemd-coredump[146178]: Process 142857 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 45:#012#0  0x00007fcef39fe32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Oct 10 05:57:13 np0005479823 systemd[1]: systemd-coredump@4-146177-0.service: Deactivated successfully.
Oct 10 05:57:13 np0005479823 systemd[1]: systemd-coredump@4-146177-0.service: Consumed 1.162s CPU time.
Oct 10 05:57:13 np0005479823 podman[146233]: 2025-10-10 09:57:13.868959555 +0000 UTC m=+0.023038791 container died b23f4245aa0739ff2aefd746e5c9116bceb56d25c8921ab2597aff854d15760d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Oct 10 05:57:13 np0005479823 systemd[1]: var-lib-containers-storage-overlay-1baeee591cf25b2c556109a3adde6faf1bce3eb6b30f02f9b94da05ddab8c8f6-merged.mount: Deactivated successfully.
Oct 10 05:57:13 np0005479823 podman[146233]: 2025-10-10 09:57:13.917133392 +0000 UTC m=+0.071212628 container remove b23f4245aa0739ff2aefd746e5c9116bceb56d25c8921ab2597aff854d15760d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, org.label-schema.schema-version=1.0)
Oct 10 05:57:13 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Main process exited, code=exited, status=139/n/a
Oct 10 05:57:14 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Failed with result 'exit-code'.
Oct 10 05:57:14 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.353s CPU time.
Oct 10 05:57:14 np0005479823 python3.9[146403]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:57:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:14 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:14 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:14.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:14 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:57:15 np0005479823 python3.9[146557]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 10 05:57:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:15.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:15 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:15 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:16 np0005479823 python3.9[146709]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 10 05:57:16 np0005479823 systemd[1]: Reloading.
Oct 10 05:57:16 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:57:16 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:57:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:16 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:16 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:16.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:17.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:17 np0005479823 python3.9[146898]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:57:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:17 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:17 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:18 np0005479823 python3.9[147051]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:57:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:18 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:18 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095718 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 05:57:18 np0005479823 python3.9[147205]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:57:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:18.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:19.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:19 np0005479823 python3.9[147359]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:57:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:19 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:19 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:19 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:57:20 np0005479823 python3.9[147512]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:57:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:20 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:20 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:20 np0005479823 python3.9[147665]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:57:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:20.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:21 np0005479823 python3.9[147820]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 05:57:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:21.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:21 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:21 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:22 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:22 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:22.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:23.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:23 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:23 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:23 np0005479823 python3.9[147975]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Oct 10 05:57:24 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Scheduled restart job, restart counter is at 5.
Oct 10 05:57:24 np0005479823 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 05:57:24 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.353s CPU time.
Oct 10 05:57:24 np0005479823 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 05:57:24 np0005479823 podman[148170]: 2025-10-10 09:57:24.449139259 +0000 UTC m=+0.043483589 container create d581a237d23fa331a7186f793a936d9a26d2be9b5cfdac77842068423ea84792 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 05:57:24 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03a086b1f9a52c382b0bf0c9603711827ef5e521aa04ce6dd516e78cd0a1e7bd/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 05:57:24 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03a086b1f9a52c382b0bf0c9603711827ef5e521aa04ce6dd516e78cd0a1e7bd/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 05:57:24 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03a086b1f9a52c382b0bf0c9603711827ef5e521aa04ce6dd516e78cd0a1e7bd/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 05:57:24 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03a086b1f9a52c382b0bf0c9603711827ef5e521aa04ce6dd516e78cd0a1e7bd/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 05:57:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:24 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:24 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:24 np0005479823 podman[148170]: 2025-10-10 09:57:24.519465129 +0000 UTC m=+0.113809479 container init d581a237d23fa331a7186f793a936d9a26d2be9b5cfdac77842068423ea84792 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 05:57:24 np0005479823 podman[148170]: 2025-10-10 09:57:24.428417653 +0000 UTC m=+0.022762003 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 05:57:24 np0005479823 podman[148170]: 2025-10-10 09:57:24.524486698 +0000 UTC m=+0.118831028 container start d581a237d23fa331a7186f793a936d9a26d2be9b5cfdac77842068423ea84792 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, OSD_FLAVOR=default)
Oct 10 05:57:24 np0005479823 bash[148170]: d581a237d23fa331a7186f793a936d9a26d2be9b5cfdac77842068423ea84792
Oct 10 05:57:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 05:57:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 05:57:24 np0005479823 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 05:57:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 05:57:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 05:57:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 05:57:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 05:57:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 05:57:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 05:57:24 np0005479823 python3.9[148171]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 10 05:57:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:57:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:24.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:57:24 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:57:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:25.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:25 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:25 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:25 np0005479823 python3.9[148387]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 10 05:57:25 np0005479823 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 05:57:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:26 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:26 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:26.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:26 np0005479823 python3.9[148549]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 05:57:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:57:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:27.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:57:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:27 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:27 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:27 np0005479823 python3.9[148634]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 05:57:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:28 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:28 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:28.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:29.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:29 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:29 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:29 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:57:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:30 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:30 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:30 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 05:57:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:30 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 05:57:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:57:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:30.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:57:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:31.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:31 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:31 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:32 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:32 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:57:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:32.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:57:33 np0005479823 podman[148676]: 2025-10-10 09:57:33.198129411 +0000 UTC m=+0.137700196 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 10 05:57:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:33.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:33 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:33 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:34 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:34 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000064s ======
Oct 10 05:57:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:34.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Oct 10 05:57:34 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:57:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:57:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:35.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:57:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:35 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:35 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:36 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:36 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Oct 10 05:57:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 05:57:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:57:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:36.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:57:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:37.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:37 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:37 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:38 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:38 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:38 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:38 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:38 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:38.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:57:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:39.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:57:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:39 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:39 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:39 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 05:57:39 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:57:39 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:57:39 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 05:57:39 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:57:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:40 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:40 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:40 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:40 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/095740 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 05:57:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:40 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c002070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:40 np0005479823 podman[148982]: 2025-10-10 09:57:40.788044933 +0000 UTC m=+0.060985630 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct 10 05:57:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:40.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:57:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:41.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:57:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:57:41.444 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 05:57:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:57:41.445 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 05:57:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:57:41.445 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 05:57:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:41 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:41 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:42 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:42 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9080025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:42 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:42 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:42 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:42.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:43.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:43 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:43 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:44 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c002070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:44 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:44 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:44 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:44 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9080025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:44.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:44 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:57:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:57:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:45.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:57:45 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:57:45 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:57:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:45 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:45 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:46 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:46 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c002070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:46 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:46 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:46 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c002070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:46.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:47.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:47 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:47 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:48 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:48 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9080032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:48 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:48 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:48 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:48.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:49.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:49 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:49 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:49 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:57:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:50 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c009990 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:50 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:50 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:50 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:50 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9080032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:50.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:57:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:51.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:57:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:51 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:51 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:52 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:52 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c009990 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:52 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:52 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:52 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:57:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:52.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:57:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:53.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:53 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:53 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:54 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:54 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:54 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:54 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:54 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c009990 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:54.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:54 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:57:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:55.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:55 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:55 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:55 np0005479823 kernel: SELinux:  Converting 2770 SID table entries...
Oct 10 05:57:55 np0005479823 kernel: SELinux:  policy capability network_peer_controls=1
Oct 10 05:57:55 np0005479823 kernel: SELinux:  policy capability open_perms=1
Oct 10 05:57:55 np0005479823 kernel: SELinux:  policy capability extended_socket_class=1
Oct 10 05:57:55 np0005479823 kernel: SELinux:  policy capability always_check_network=0
Oct 10 05:57:55 np0005479823 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 10 05:57:55 np0005479823 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 10 05:57:55 np0005479823 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 10 05:57:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:56 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:56 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:56 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:56 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:56 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:56.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:57.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:57 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:57 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:58 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:58 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:58 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:58 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:57:58 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:57:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:57:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:57:58.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:57:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:57:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:57:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:57:59.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:57:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:57:59 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:57:59 2025: (VI_0) received an invalid passwd!
Oct 10 05:57:59 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:58:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:00 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:00 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:00 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:00 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:00 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:58:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:00.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:58:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:01.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:01 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:01 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:02 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:02 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:02 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:02 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:02 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:02.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:58:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:03.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:58:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:03 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:03 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:03 np0005479823 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Oct 10 05:58:03 np0005479823 podman[149088]: 2025-10-10 09:58:03.893691784 +0000 UTC m=+0.163753292 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Oct 10 05:58:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:04 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:04 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:04 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:04 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:04 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:58:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:04.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:58:04 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:58:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:58:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:05.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:58:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:05 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:05 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:06 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:06 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:06 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:06 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:06 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:06 np0005479823 kernel: SELinux:  Converting 2770 SID table entries...
Oct 10 05:58:06 np0005479823 kernel: SELinux:  policy capability network_peer_controls=1
Oct 10 05:58:06 np0005479823 kernel: SELinux:  policy capability open_perms=1
Oct 10 05:58:06 np0005479823 kernel: SELinux:  policy capability extended_socket_class=1
Oct 10 05:58:06 np0005479823 kernel: SELinux:  policy capability always_check_network=0
Oct 10 05:58:06 np0005479823 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 10 05:58:06 np0005479823 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 10 05:58:06 np0005479823 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 10 05:58:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:58:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:06.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:58:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:58:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:07.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:58:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:07 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:07 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:08 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:08 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:08 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:08 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:08 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:08.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:09.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:09 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:09 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:09 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:58:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:10 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:10 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:10 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:10 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:10 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:58:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:10.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:58:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:58:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:11.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:58:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:11 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:11 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:11 np0005479823 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Oct 10 05:58:11 np0005479823 podman[149131]: 2025-10-10 09:58:11.793999485 +0000 UTC m=+0.059415231 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 10 05:58:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:12 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:12 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:12 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:12 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:12 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:12.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:58:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:13.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:58:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:13 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:13 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:14 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:14 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:14 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:14 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:14 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:14.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:14 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:58:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:15.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:15 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:15 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:16 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:16 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:16 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:16 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:16 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:58:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:16.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:58:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:58:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:17.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:58:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:17 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:17 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:18 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:18 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:18 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:18 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:18 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:18.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:58:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:19.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:58:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:19 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:19 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:19 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:58:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:20 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:20 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:20 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:20 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:20 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:58:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:20.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:58:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:21.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:21 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:21 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:22 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:22 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:22 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:22 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:22 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:22.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:58:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:23.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:58:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:23 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:23 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:24 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:24 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:24.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:24 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:58:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:58:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:25.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:58:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:25 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:25 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:26 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:26 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:26 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:26 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:26 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:26.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:27.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:27 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:27 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:28 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:28 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:28 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:28 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:28 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:58:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:28.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:58:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:58:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:29.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:58:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:29 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:29 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:29 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:58:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:30 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:30 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:30 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:30 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:30 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:30.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:58:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:31.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:58:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:31 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:31 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:32 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:32 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:32 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:32 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:32 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:32.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:33.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:33 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:33 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:34 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:34 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:34 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:34 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:34 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:34 np0005479823 podman[160060]: 2025-10-10 09:58:34.809646 +0000 UTC m=+0.084761653 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 10 05:58:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:34.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:34 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:58:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:35.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:35 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:35 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:36 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:36 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:36.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:58:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:37.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:58:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:37 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:37 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:38 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:38 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c00a2b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:38 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:38 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:38 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:38.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:58:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:39.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:58:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:39 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:39 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:39 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:58:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:40 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:40 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:40 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:40 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:40 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:40.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:58:41.445 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 05:58:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:58:41.446 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 05:58:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:58:41.446 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 05:58:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:41.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:41 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:41 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:42 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:42 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:42 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:42 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:42 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:42 np0005479823 podman[165801]: 2025-10-10 09:58:42.782863567 +0000 UTC m=+0.056374844 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 10 05:58:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:42.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:43.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:43 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:43 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:44 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:44 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:44 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:44 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:44 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:58:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:44.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:58:44 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:58:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:45.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:45 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:45 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:45 np0005479823 podman[166151]: 2025-10-10 09:58:45.626973247 +0000 UTC m=+0.134119068 container exec bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Oct 10 05:58:45 np0005479823 podman[166151]: 2025-10-10 09:58:45.73021688 +0000 UTC m=+0.237362701 container exec_died bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2)
Oct 10 05:58:45 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Oct 10 05:58:46 np0005479823 podman[166275]: 2025-10-10 09:58:46.202618624 +0000 UTC m=+0.089980450 container exec 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 05:58:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:46 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:46 np0005479823 podman[166304]: 2025-10-10 09:58:46.274015692 +0000 UTC m=+0.054300533 container exec_died 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 05:58:46 np0005479823 podman[166275]: 2025-10-10 09:58:46.341597236 +0000 UTC m=+0.228959072 container exec_died 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 05:58:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:46 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:46 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:46 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:46 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:46 np0005479823 podman[166373]: 2025-10-10 09:58:46.938204612 +0000 UTC m=+0.105552617 container exec d581a237d23fa331a7186f793a936d9a26d2be9b5cfdac77842068423ea84792 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, ceph=True)
Oct 10 05:58:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:46.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:47 np0005479823 podman[166394]: 2025-10-10 09:58:47.013045879 +0000 UTC m=+0.056126981 container exec_died d581a237d23fa331a7186f793a936d9a26d2be9b5cfdac77842068423ea84792 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 05:58:47 np0005479823 podman[166373]: 2025-10-10 09:58:47.039842833 +0000 UTC m=+0.207190808 container exec_died d581a237d23fa331a7186f793a936d9a26d2be9b5cfdac77842068423ea84792 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Oct 10 05:58:47 np0005479823 podman[166437]: 2025-10-10 09:58:47.467280024 +0000 UTC m=+0.221243046 container exec 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 05:58:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:47.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:47 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:47 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:47 np0005479823 podman[166459]: 2025-10-10 09:58:47.628032441 +0000 UTC m=+0.053642612 container exec_died 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 05:58:47 np0005479823 podman[166437]: 2025-10-10 09:58:47.68163559 +0000 UTC m=+0.435598592 container exec_died 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 05:58:47 np0005479823 podman[166506]: 2025-10-10 09:58:47.92497495 +0000 UTC m=+0.050061407 container exec 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20, release=1793, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, description=keepalived for Ceph, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=)
Oct 10 05:58:48 np0005479823 podman[166527]: 2025-10-10 09:58:48.052041293 +0000 UTC m=+0.050494471 container exec_died 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, version=2.2.4, com.redhat.component=keepalived-container, io.openshift.expose-services=, architecture=x86_64, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, description=keepalived for Ceph, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Oct 10 05:58:48 np0005479823 podman[166506]: 2025-10-10 09:58:48.063166348 +0000 UTC m=+0.188252785 container exec_died 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, version=2.2.4, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20, distribution-scope=public, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, io.openshift.expose-services=)
Oct 10 05:58:48 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:58:48 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:58:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:48 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:48 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:48 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:48 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:48 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:58:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:48.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:58:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:58:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:49.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:58:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:49 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:49 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:49 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Oct 10 05:58:49 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:58:49 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:58:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:58:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:50 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:50 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:50 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:50 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:50 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:50.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:51.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:51 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:51 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:51 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Oct 10 05:58:51 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 05:58:51 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:58:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:52 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:52 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:52 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:52 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:52 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:52 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:58:52 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 05:58:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:52.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:53.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:53 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:53 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:54 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:54 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:54 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:54 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:54 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:54.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:58:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:55.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:55 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:55 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:56 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:56 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:56 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:56 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:56 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:56.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:58:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:57.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:58:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:57 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:57 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:58 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003e10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:58 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:58 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:58 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:58 np0005479823 kernel: SELinux:  Converting 2771 SID table entries...
Oct 10 05:58:58 np0005479823 kernel: SELinux:  policy capability network_peer_controls=1
Oct 10 05:58:58 np0005479823 kernel: SELinux:  policy capability open_perms=1
Oct 10 05:58:58 np0005479823 kernel: SELinux:  policy capability extended_socket_class=1
Oct 10 05:58:58 np0005479823 kernel: SELinux:  policy capability always_check_network=0
Oct 10 05:58:58 np0005479823 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 10 05:58:58 np0005479823 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 10 05:58:58 np0005479823 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 10 05:58:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:58:58 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:58:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000064s ======
Oct 10 05:58:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:58:58.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Oct 10 05:58:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:58:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:58:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:58:59.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:58:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:58:59 2025: (VI_0) received an invalid passwd!
Oct 10 05:58:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:58:59 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:59:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:00 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:00 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8003e30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:00 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:00 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:00 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:00.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:01 np0005479823 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Oct 10 05:59:01 np0005479823 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Oct 10 05:59:01 np0005479823 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Oct 10 05:59:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:59:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:01.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:59:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:01 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:01 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:02 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:02 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:02 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:02 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:02 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:02.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:03.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:03 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:59:03 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 05:59:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:03 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:03 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:04 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:04 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:04 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:04 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:04 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:04.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:05 np0005479823 podman[166870]: 2025-10-10 09:59:05.046438183 +0000 UTC m=+0.080493058 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct 10 05:59:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:59:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:05.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:05 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:05 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:06 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:06 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:06 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:06 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:06 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:59:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:06.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:59:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:59:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:07.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:59:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:07 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:07 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:08 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:08 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:08 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:08 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:08 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:59:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:08.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:59:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:09.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:09 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:09 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:10 np0005479823 systemd[1]: Stopping OpenSSH server daemon...
Oct 10 05:59:10 np0005479823 systemd[1]: sshd.service: Deactivated successfully.
Oct 10 05:59:10 np0005479823 systemd[1]: Stopped OpenSSH server daemon.
Oct 10 05:59:10 np0005479823 systemd[1]: sshd.service: Consumed 2.319s CPU time, read 0B from disk, written 4.0K to disk.
Oct 10 05:59:10 np0005479823 systemd[1]: Stopped target sshd-keygen.target.
Oct 10 05:59:10 np0005479823 systemd[1]: Stopping sshd-keygen.target...
Oct 10 05:59:10 np0005479823 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 10 05:59:10 np0005479823 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 10 05:59:10 np0005479823 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 10 05:59:10 np0005479823 systemd[1]: Reached target sshd-keygen.target.
Oct 10 05:59:10 np0005479823 systemd[1]: Starting OpenSSH server daemon...
Oct 10 05:59:10 np0005479823 systemd[1]: Started OpenSSH server daemon.
Oct 10 05:59:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:59:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:10 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:10 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:10 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:10 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:10 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:10.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:11.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:11 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:11 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:11 np0005479823 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 10 05:59:11 np0005479823 systemd[1]: Starting man-db-cache-update.service...
Oct 10 05:59:12 np0005479823 systemd[1]: Reloading.
Oct 10 05:59:12 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:59:12 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:59:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:12 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:12 np0005479823 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 10 05:59:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:12 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:12 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:12 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:12 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:12.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:59:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:13.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:59:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:13 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:13 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:13 np0005479823 podman[169588]: 2025-10-10 09:59:13.688821789 +0000 UTC m=+0.079842277 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 10 05:59:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:14 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:14 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:14 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:14 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:14 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:14.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:59:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:59:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:15.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:59:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:15 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:15 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:16 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:16 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:16 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:16 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:16 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:16 np0005479823 systemd[1]: Starting PackageKit Daemon...
Oct 10 05:59:16 np0005479823 systemd[1]: Started PackageKit Daemon.
Oct 10 05:59:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:16.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:17.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:17 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:17 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:18 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:18 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:18 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:18 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:18 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:18.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:19.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:19 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:19 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:19 np0005479823 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 10 05:59:19 np0005479823 systemd[1]: Finished man-db-cache-update.service.
Oct 10 05:59:19 np0005479823 systemd[1]: man-db-cache-update.service: Consumed 9.823s CPU time.
Oct 10 05:59:19 np0005479823 systemd[1]: run-r20f4e3d58f8d41ba9f126ba5b28f19ed.service: Deactivated successfully.
Oct 10 05:59:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:59:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:20 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:20 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:20 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:20 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:20 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:59:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:20.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:59:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:21.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:21 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:21 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:22 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:22 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:22 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:22 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:22 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:59:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:22.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:59:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:23 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:23 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:23.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:24 np0005479823 python3.9[176379]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 10 05:59:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:24 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:24 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:24 np0005479823 systemd[1]: Reloading.
Oct 10 05:59:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:24 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:24 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:59:24 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:59:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:24.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:59:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:25 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:25 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:25.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:25 np0005479823 python3.9[176570]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 10 05:59:25 np0005479823 systemd[1]: Reloading.
Oct 10 05:59:25 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:59:25 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:59:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:26 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:26 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:26 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:26 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:26 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:26 np0005479823 python3.9[176760]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 10 05:59:26 np0005479823 systemd[1]: Reloading.
Oct 10 05:59:26 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:59:26 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:59:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:26.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:27 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:27 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:59:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:27.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:59:27 np0005479823 python3.9[176952]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 10 05:59:27 np0005479823 systemd[1]: Reloading.
Oct 10 05:59:27 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:59:27 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:59:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:28 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:28 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:28 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:28 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:28 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:59:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:28.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:59:29 np0005479823 python3.9[177144]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 05:59:29 np0005479823 systemd[1]: Reloading.
Oct 10 05:59:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:29 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:29 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:29.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:29 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:59:29 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:59:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:59:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:30 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:30 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:30 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:30 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:30 np0005479823 python3.9[177334]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 05:59:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:30 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:30 np0005479823 systemd[1]: Reloading.
Oct 10 05:59:30 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:59:30 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:59:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:30.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:31 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:31 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:31.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:31 np0005479823 python3.9[177526]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 05:59:31 np0005479823 systemd[1]: Reloading.
Oct 10 05:59:31 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:59:31 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:59:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:32 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:32 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:32 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:32 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:32 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:32 np0005479823 python3.9[177717]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 05:59:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:32.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:33 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:33 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:33.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:33 np0005479823 python3.9[177873]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 05:59:33 np0005479823 systemd[1]: Reloading.
Oct 10 05:59:34 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:59:34 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:59:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:34 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:34 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:34 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:34 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:34 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40036a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:59:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:34.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:59:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:59:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:35 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:35 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:35.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:35 np0005479823 podman[177963]: 2025-10-10 09:59:35.884153317 +0000 UTC m=+0.144655024 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Oct 10 05:59:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:36 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:36 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:36 np0005479823 python3.9[178116]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 10 05:59:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:36 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:36 np0005479823 systemd[1]: Reloading.
Oct 10 05:59:36 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 05:59:36 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 05:59:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:59:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:36.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:59:37 np0005479823 systemd[1]: Listening on libvirt proxy daemon socket.
Oct 10 05:59:37 np0005479823 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Oct 10 05:59:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:37 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:37 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:59:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:37.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:59:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:38 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8dc003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:38 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:38 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:38 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:38 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:38.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:39 np0005479823 python3.9[178314]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 05:59:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:39 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:39 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:39.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:39 np0005479823 python3.9[178470]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 05:59:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:59:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:40 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:40 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:40 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:40 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:40 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003c30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:40 np0005479823 python3.9[178626]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 05:59:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:40.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:59:41.447 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 05:59:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:59:41.447 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 05:59:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 09:59:41.447 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 05:59:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:41 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:41 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:59:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:41.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:59:41 np0005479823 python3.9[178782]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 05:59:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:42 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:42 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:42 np0005479823 python3.9[178937]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 05:59:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:42 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:42 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:42 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:42.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:43 np0005479823 python3.9[179094]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 05:59:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:43 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:43 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:59:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:43.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:59:43 np0005479823 podman[179198]: 2025-10-10 09:59:43.792012188 +0000 UTC m=+0.070701975 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct 10 05:59:44 np0005479823 python3.9[179268]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 05:59:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:44 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:44 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:44 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:44 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:44 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:59:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:45.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:59:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:59:45 np0005479823 python3.9[179425]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 05:59:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:45 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:45 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:59:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:45.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:59:45 np0005479823 python3.9[179580]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 05:59:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:46 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:46 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:46 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:46 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:46 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:46 np0005479823 python3.9[179735]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 05:59:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:59:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:47.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:59:47 np0005479823 python3.9[179892]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 05:59:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:47 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:47 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:59:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:47.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:59:48 np0005479823 python3.9[180047]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 05:59:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:48 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:48 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:48 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:48 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:48 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:59:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:49.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:59:49 np0005479823 python3.9[180204]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 05:59:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:49 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:49 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:49.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:49 np0005479823 python3.9[180359]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 10 05:59:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:59:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:50 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:50 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:50 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:50 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:50 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:59:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:51.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:59:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:51 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:51 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 05:59:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:51.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 05:59:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:52 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:52 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:52 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:52 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:52 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:52 np0005479823 python3.9[180518]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:59:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:53.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:53 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:53 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:53.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:53 np0005479823 python3.9[180670]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:59:54 np0005479823 python3.9[180847]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:59:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:54 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:54 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:54 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:54 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:54 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:54 np0005479823 python3.9[181000]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:59:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:55.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 05:59:55 np0005479823 python3.9[181153]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:59:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:55 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:55 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 05:59:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:55.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 05:59:56 np0005479823 python3.9[181305]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 10 05:59:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:56 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:56 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:56 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:56 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:56 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:57.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:57 np0005479823 python3.9[181459]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:59:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:57 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:57 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:57.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:58 np0005479823 python3.9[181584]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760090396.7745173-1625-79440985962375/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:59:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:58 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e80013d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:58 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:58 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:58 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 09:59:58 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40043b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 05:59:58 np0005479823 python3.9[181737]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 05:59:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:09:59:59.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:59 np0005479823 python3.9[181863]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760090398.3343463-1625-5944362674543/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 05:59:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 09:59:59 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 09:59:59 2025: (VI_0) received an invalid passwd!
Oct 10 05:59:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 05:59:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 05:59:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:09:59:59.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 05:59:59 np0005479823 python3.9[182015]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:00:00 np0005479823 ceph-mon[74913]: overall HEALTH_OK
Oct 10 06:00:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:00:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:00 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:00 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8004690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:00 np0005479823 python3.9[182140]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760090399.5006964-1625-279787395054929/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:00 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:01.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:01 np0005479823 python3.9[182294]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:00:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:00:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:01.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:00:01 np0005479823 python3.9[182419]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760090400.7160127-1625-11903019279587/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:02 np0005479823 python3.9[182571]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:00:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:02 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e40043d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:02 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:02 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:02 np0005479823 python3.9[182697]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760090401.8734188-1625-51238216520086/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:03.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:03 np0005479823 python3.9[182920]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:00:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:03.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:04 np0005479823 python3.9[183057]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760090403.088456-1625-140997741223943/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:04 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:04 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8004690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:04 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:04 np0005479823 python3.9[183210]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:00:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:05.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:00:05 np0005479823 python3.9[183334]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760090404.2670188-1625-245643392618880/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:05.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:05 np0005479823 python3.9[183486]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:00:06 np0005479823 podman[183583]: 2025-10-10 10:00:06.248266746 +0000 UTC m=+0.085137217 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct 10 06:00:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:06 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:06 np0005479823 python3.9[183630]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1760090405.450699-1625-62421646790118/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:06 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003f90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:06 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8004690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:07.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:07 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:00:07 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:00:07 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:00:07 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:00:07 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:00:07 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:00:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:00:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:07.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:00:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:08 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:08 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e4004430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:08 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003f90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:08 np0005479823 python3.9[183792]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Oct 10 06:00:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:09.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:00:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:09.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:00:09 np0005479823 python3.9[183946]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:00:10 np0005479823 python3.9[184098]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:10 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8004690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:10 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb90c008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:10 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:10 np0005479823 python3.9[184253]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:11.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:11 np0005479823 python3.9[184406]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:11.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:12 np0005479823 python3.9[184558]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:12 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003f90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:12 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8004690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:12 np0005479823 python3.9[184710]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:12 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8e8004690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:13.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:13 np0005479823 python3.9[184889]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:13 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:00:13 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:00:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:13.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:13 np0005479823 python3.9[185041]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:13 np0005479823 podman[185090]: 2025-10-10 10:00:13.949542119 +0000 UTC m=+0.062461454 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 10 06:00:14 np0005479823 python3.9[185235]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:14 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb908002ee0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:14 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003f90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:14 np0005479823 kernel: ganesha.nfsd[165476]: segfault at 50 ip 00007fb9bb29732e sp 00007fb9897f9210 error 4 in libntirpc.so.5.8[7fb9bb27c000+2c000] likely on CPU 0 (core 0, socket 0)
Oct 10 06:00:14 np0005479823 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Oct 10 06:00:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[148186]: 10/10/2025 10:00:14 : epoch 68e8d884 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb8fc003f90 fd 38 proxy ignored for local
Oct 10 06:00:14 np0005479823 systemd[1]: Started Process Core Dump (PID 185357/UID 0).
Oct 10 06:00:14 np0005479823 python3.9[185390]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:15.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:00:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:00:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:15.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:00:15 np0005479823 python3.9[185543]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:15 np0005479823 systemd-coredump[185361]: Process 148191 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 59:#012#0  0x00007fb9bb29732e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Oct 10 06:00:15 np0005479823 systemd[1]: systemd-coredump@5-185357-0.service: Deactivated successfully.
Oct 10 06:00:15 np0005479823 systemd[1]: systemd-coredump@5-185357-0.service: Consumed 1.149s CPU time.
Oct 10 06:00:15 np0005479823 podman[185608]: 2025-10-10 10:00:15.94643285 +0000 UTC m=+0.024743380 container died d581a237d23fa331a7186f793a936d9a26d2be9b5cfdac77842068423ea84792 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 06:00:15 np0005479823 systemd[1]: var-lib-containers-storage-overlay-03a086b1f9a52c382b0bf0c9603711827ef5e521aa04ce6dd516e78cd0a1e7bd-merged.mount: Deactivated successfully.
Oct 10 06:00:15 np0005479823 podman[185608]: 2025-10-10 10:00:15.982558162 +0000 UTC m=+0.060868692 container remove d581a237d23fa331a7186f793a936d9a26d2be9b5cfdac77842068423ea84792 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True)
Oct 10 06:00:15 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Main process exited, code=exited, status=139/n/a
Oct 10 06:00:16 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Failed with result 'exit-code'.
Oct 10 06:00:16 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.661s CPU time.
Oct 10 06:00:16 np0005479823 python3.9[185742]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:17 np0005479823 python3.9[185896]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:17.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:17 np0005479823 python3.9[186048]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:17.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:00:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:19.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:00:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:19.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:00:20 np0005479823 python3.9[186202]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:00:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100020 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 06:00:20 np0005479823 python3.9[186326]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090419.8201063-2288-62071139539853/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:21.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:21 np0005479823 python3.9[186479]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:00:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:00:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:21.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:00:21 np0005479823 python3.9[186602]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090420.9733894-2288-54614717059651/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:22 np0005479823 python3.9[186754]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:00:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:00:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:23.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:00:23 np0005479823 python3.9[186879]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090422.1163347-2288-105092995202982/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:00:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:23.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:00:23 np0005479823 python3.9[187031]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:00:24 np0005479823 python3.9[187154]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090423.3248363-2288-109096476681839/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:24 np0005479823 python3.9[187307]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:00:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:25.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:00:25 np0005479823 python3.9[187431]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090424.4402773-2288-46803250057139/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:25.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:26 np0005479823 python3.9[187583]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:00:26 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Scheduled restart job, restart counter is at 6.
Oct 10 06:00:26 np0005479823 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 06:00:26 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.661s CPU time.
Oct 10 06:00:26 np0005479823 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 06:00:26 np0005479823 podman[187699]: 2025-10-10 10:00:26.402294566 +0000 UTC m=+0.036029410 container create d2ad8466cb615ea6f40a71e4248a655d65b9d3319f4ec44bced7edb4eae94628 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 06:00:26 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d1140c1c832a7cbf54fd0203a6ee559ca50a4b6de6ddc3b0879e0b1307a09df/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 06:00:26 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d1140c1c832a7cbf54fd0203a6ee559ca50a4b6de6ddc3b0879e0b1307a09df/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 06:00:26 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d1140c1c832a7cbf54fd0203a6ee559ca50a4b6de6ddc3b0879e0b1307a09df/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 06:00:26 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d1140c1c832a7cbf54fd0203a6ee559ca50a4b6de6ddc3b0879e0b1307a09df/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 06:00:26 np0005479823 podman[187699]: 2025-10-10 10:00:26.459935314 +0000 UTC m=+0.093670178 container init d2ad8466cb615ea6f40a71e4248a655d65b9d3319f4ec44bced7edb4eae94628 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 06:00:26 np0005479823 podman[187699]: 2025-10-10 10:00:26.465516522 +0000 UTC m=+0.099251366 container start d2ad8466cb615ea6f40a71e4248a655d65b9d3319f4ec44bced7edb4eae94628 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid)
Oct 10 06:00:26 np0005479823 bash[187699]: d2ad8466cb615ea6f40a71e4248a655d65b9d3319f4ec44bced7edb4eae94628
Oct 10 06:00:26 np0005479823 podman[187699]: 2025-10-10 10:00:26.387768142 +0000 UTC m=+0.021502986 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 06:00:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 06:00:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 06:00:26 np0005479823 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 06:00:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 06:00:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 06:00:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 06:00:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 06:00:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 06:00:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:00:26 np0005479823 python3.9[187771]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090425.6616993-2288-162260178341038/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:27.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:27 np0005479823 python3.9[187962]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:00:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:27.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:27 np0005479823 python3.9[188085]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090426.8202562-2288-84309736547384/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:28 np0005479823 python3.9[188237]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:00:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:28 np0005479823 python3.9[188361]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090427.9268405-2288-154365331841330/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:00:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:29.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:00:29 np0005479823 python3.9[188514]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:00:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:00:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:29.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:00:30 np0005479823 python3.9[188637]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090429.0868628-2288-88523671889339/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:00:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:30 np0005479823 python3.9[188790]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:00:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:31.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:31 np0005479823 python3.9[188914]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090430.273673-2288-89285000421732/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:31.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:31 np0005479823 python3.9[189066]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:00:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100032 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 06:00:32 np0005479823 python3.9[189189]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090431.3919141-2288-9440498015569/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:32 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:00:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:32 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:00:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:32 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:00:32 np0005479823 python3.9[189342]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:00:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:33.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:33 np0005479823 python3.9[189466]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090432.4962604-2288-168625719786527/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:00:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:33.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:00:34 np0005479823 python3.9[189618]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:00:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:34 np0005479823 python3.9[189766]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090433.583281-2288-161681802084065/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:35.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:00:35 np0005479823 python3.9[189920]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:00:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:35.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:35 np0005479823 python3.9[190043]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090434.782771-2288-92486001447859/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:36 np0005479823 podman[190069]: 2025-10-10 10:00:36.823373894 +0000 UTC m=+0.099633369 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 06:00:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:36 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:00:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:36 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:00:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:36 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:00:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:37 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:00:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:00:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:37.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:00:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:37.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:37 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:00:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:37 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:00:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:37 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:00:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:00:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:39.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:00:39 np0005479823 python3.9[190223]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 06:00:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:39.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:40 np0005479823 python3.9[190378]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Oct 10 06:00:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:00:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:00:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:41.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:00:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:00:41.448 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:00:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:00:41.449 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:00:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:00:41.449 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:00:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:41.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:41 np0005479823 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Oct 10 06:00:41 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Oct 10 06:00:41 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:41.921276) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 06:00:41 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Oct 10 06:00:41 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090441921343, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 4678, "num_deletes": 502, "total_data_size": 12897035, "memory_usage": 13062384, "flush_reason": "Manual Compaction"}
Oct 10 06:00:41 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Oct 10 06:00:41 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090441998685, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 8357485, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13343, "largest_seqno": 18016, "table_properties": {"data_size": 8339729, "index_size": 12010, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 4677, "raw_key_size": 36450, "raw_average_key_size": 19, "raw_value_size": 8303208, "raw_average_value_size": 4480, "num_data_blocks": 525, "num_entries": 1853, "num_filter_entries": 1853, "num_deletions": 502, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089994, "oldest_key_time": 1760089994, "file_creation_time": 1760090441, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:00:41 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 77754 microseconds, and 15325 cpu microseconds.
Oct 10 06:00:41 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:41.999039) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 8357485 bytes OK
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:41.999150) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.000747) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.000763) EVENT_LOG_v1 {"time_micros": 1760090442000757, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.000779) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 12876639, prev total WAL file size 12876639, number of live WAL files 2.
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.003972) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(8161KB)], [27(12MB)]
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090442004031, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 21058914, "oldest_snapshot_seqno": -1}
Oct 10 06:00:42 np0005479823 python3.9[190536]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 5072 keys, 15514049 bytes, temperature: kUnknown
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090442151306, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 15514049, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15475480, "index_size": 24763, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12741, "raw_key_size": 126878, "raw_average_key_size": 25, "raw_value_size": 15378919, "raw_average_value_size": 3032, "num_data_blocks": 1042, "num_entries": 5072, "num_filter_entries": 5072, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760090442, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.151575) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 15514049 bytes
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.153549) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 142.9 rd, 105.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(8.0, 12.1 +0.0 blob) out(14.8 +0.0 blob), read-write-amplify(4.4) write-amplify(1.9) OK, records in: 6094, records dropped: 1022 output_compression: NoCompression
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.153601) EVENT_LOG_v1 {"time_micros": 1760090442153579, "job": 14, "event": "compaction_finished", "compaction_time_micros": 147377, "compaction_time_cpu_micros": 27757, "output_level": 6, "num_output_files": 1, "total_output_size": 15514049, "num_input_records": 6094, "num_output_records": 5072, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090442155041, "job": 14, "event": "table_file_deletion", "file_number": 29}
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090442156973, "job": 14, "event": "table_file_deletion", "file_number": 27}
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.003884) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.157055) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.157061) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.157062) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.157064) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:00:42 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:00:42.157065) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:00:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:42 np0005479823 python3.9[190689]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:43.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:43 np0005479823 python3.9[190842]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:43.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:43 np0005479823 python3.9[190994]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 06:00:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:43 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:00:44 np0005479823 podman[191130]: 2025-10-10 10:00:44.311853291 +0000 UTC m=+0.060501751 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 06:00:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:44 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d28000df0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:44 np0005479823 python3.9[191175]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:44 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:44 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:00:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:45.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:00:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:00:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:45.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:45 np0005479823 python3.9[191334]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:46 np0005479823 python3.9[191486]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100046 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 06:00:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:46 np0005479823 python3.9[191639]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:00:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:00:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:47.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:47 np0005479823 python3.9[191792]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:47.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:48 np0005479823 python3.9[191944]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:00:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:49.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:00:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:49.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:49 np0005479823 python3.9[192098]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 06:00:49 np0005479823 systemd[1]: Reloading.
Oct 10 06:00:49 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 06:00:49 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 06:00:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:49 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 06:00:49 np0005479823 systemd[1]: Starting libvirt logging daemon socket...
Oct 10 06:00:49 np0005479823 systemd[1]: Listening on libvirt logging daemon socket.
Oct 10 06:00:49 np0005479823 systemd[1]: Starting libvirt logging daemon admin socket...
Oct 10 06:00:50 np0005479823 systemd[1]: Listening on libvirt logging daemon admin socket.
Oct 10 06:00:50 np0005479823 systemd[1]: Starting libvirt logging daemon...
Oct 10 06:00:50 np0005479823 systemd[1]: Started libvirt logging daemon.
Oct 10 06:00:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:00:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:50 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d14001720 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:50 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:50 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:50 np0005479823 python3.9[192292]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 06:00:50 np0005479823 systemd[1]: Reloading.
Oct 10 06:00:50 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 06:00:50 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 06:00:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:51.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:51 np0005479823 systemd[1]: Starting libvirt nodedev daemon socket...
Oct 10 06:00:51 np0005479823 systemd[1]: Listening on libvirt nodedev daemon socket.
Oct 10 06:00:51 np0005479823 systemd[1]: Starting libvirt nodedev daemon admin socket...
Oct 10 06:00:51 np0005479823 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Oct 10 06:00:51 np0005479823 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Oct 10 06:00:51 np0005479823 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Oct 10 06:00:51 np0005479823 systemd[1]: Starting libvirt nodedev daemon...
Oct 10 06:00:51 np0005479823 systemd[1]: Started libvirt nodedev daemon.
Oct 10 06:00:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:51.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:51 np0005479823 python3.9[192508]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 06:00:51 np0005479823 systemd[1]: Reloading.
Oct 10 06:00:52 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 06:00:52 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 06:00:52 np0005479823 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Oct 10 06:00:52 np0005479823 systemd[1]: Starting libvirt proxy daemon admin socket...
Oct 10 06:00:52 np0005479823 systemd[1]: Starting libvirt proxy daemon read-only socket...
Oct 10 06:00:52 np0005479823 systemd[1]: Listening on libvirt proxy daemon admin socket.
Oct 10 06:00:52 np0005479823 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Oct 10 06:00:52 np0005479823 systemd[1]: Starting libvirt proxy daemon...
Oct 10 06:00:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100052 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 06:00:52 np0005479823 systemd[1]: Started libvirt proxy daemon.
Oct 10 06:00:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:52 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:52 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d14002240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:52 np0005479823 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Oct 10 06:00:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:52 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:52 np0005479823 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Oct 10 06:00:52 np0005479823 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Oct 10 06:00:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:53.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:53 np0005479823 python3.9[192727]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 06:00:53 np0005479823 systemd[1]: Reloading.
Oct 10 06:00:53 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 06:00:53 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 06:00:53 np0005479823 systemd[1]: Listening on libvirt locking daemon socket.
Oct 10 06:00:53 np0005479823 systemd[1]: Starting libvirt QEMU daemon socket...
Oct 10 06:00:53 np0005479823 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Oct 10 06:00:53 np0005479823 systemd[1]: Starting Virtual Machine and Container Registration Service...
Oct 10 06:00:53 np0005479823 systemd[1]: Listening on libvirt QEMU daemon socket.
Oct 10 06:00:53 np0005479823 systemd[1]: Starting libvirt QEMU daemon admin socket...
Oct 10 06:00:53 np0005479823 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Oct 10 06:00:53 np0005479823 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Oct 10 06:00:53 np0005479823 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Oct 10 06:00:53 np0005479823 systemd[1]: Started Virtual Machine and Container Registration Service.
Oct 10 06:00:53 np0005479823 systemd[1]: Starting libvirt QEMU daemon...
Oct 10 06:00:53 np0005479823 systemd[1]: Started libvirt QEMU daemon.
Oct 10 06:00:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:53.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:53 np0005479823 setroubleshoot[192545]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 41dc1415-034a-47c4-9f0f-7f67ccec6a71
Oct 10 06:00:53 np0005479823 setroubleshoot[192545]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Oct 10 06:00:53 np0005479823 setroubleshoot[192545]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 41dc1415-034a-47c4-9f0f-7f67ccec6a71
Oct 10 06:00:53 np0005479823 setroubleshoot[192545]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Oct 10 06:00:54 np0005479823 python3.9[192942]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 06:00:54 np0005479823 systemd[1]: Reloading.
Oct 10 06:00:54 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 06:00:54 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 06:00:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:54 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:54 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:54 np0005479823 systemd[1]: Starting libvirt secret daemon socket...
Oct 10 06:00:54 np0005479823 systemd[1]: Listening on libvirt secret daemon socket.
Oct 10 06:00:54 np0005479823 systemd[1]: Starting libvirt secret daemon admin socket...
Oct 10 06:00:54 np0005479823 systemd[1]: Starting libvirt secret daemon read-only socket...
Oct 10 06:00:54 np0005479823 systemd[1]: Listening on libvirt secret daemon admin socket.
Oct 10 06:00:54 np0005479823 systemd[1]: Listening on libvirt secret daemon read-only socket.
Oct 10 06:00:54 np0005479823 systemd[1]: Starting libvirt secret daemon...
Oct 10 06:00:54 np0005479823 systemd[1]: Started libvirt secret daemon.
Oct 10 06:00:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:54 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d14002240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:00:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:55.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:00:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:00:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:55.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:56 np0005479823 python3.9[193179]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:00:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:56 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:56 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:56 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:57.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:57 np0005479823 python3.9[193333]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 10 06:00:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:57.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:58 np0005479823 python3.9[193485]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 06:00:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:58 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d14002240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:58 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:00:58 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:00:58 np0005479823 python3.9[193640]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 10 06:00:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:00:59.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:00:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:00:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:00:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:00:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:00:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:00:59.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:00:59 np0005479823 python3.9[193791]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:01:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:01:00 np0005479823 auditd[702]: Audit daemon rotating log files
Oct 10 06:01:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:00 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc0032f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:00 np0005479823 python3.9[193912]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090459.4982643-3363-174894785370782/.source.xml follow=False _original_basename=secret.xml.j2 checksum=baa25a2f67c100fe0cd0e069ccc25ef935446dd6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:01:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:00 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:00 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:01.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:01:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:01.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:01:01 np0005479823 python3.9[194066]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 21f084a3-af34-5230-afe4-ea5cd24a55f4#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 06:01:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:02 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:02 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc0032f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:02 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:02 np0005479823 python3.9[194244]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:01:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:03.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:03.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:03 np0005479823 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Oct 10 06:01:03 np0005479823 systemd[1]: setroubleshootd.service: Deactivated successfully.
Oct 10 06:01:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100104 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 06:01:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:04 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:04 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:04 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:05.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:01:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:01:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:05.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:01:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:06 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:06 np0005479823 python3.9[194710]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:01:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:06 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:06 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:07 np0005479823 podman[194836]: 2025-10-10 10:01:07.070992408 +0000 UTC m=+0.102333748 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct 10 06:01:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:07.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:07 np0005479823 python3.9[194881]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:01:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:01:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:07.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:01:07 np0005479823 python3.9[195013]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090466.7195778-3527-240861844818289/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:01:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:08 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:08 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:08 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:08 np0005479823 python3.9[195166]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:01:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:01:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:09.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:01:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100109 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 06:01:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:01:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:09.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:01:09 np0005479823 python3.9[195319]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:01:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:01:10 np0005479823 python3.9[195397]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:01:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:10 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:10 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:10 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:01:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:11.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:01:11 np0005479823 python3.9[195551]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:01:11 np0005479823 python3.9[195629]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.c9eebzku recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:01:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:01:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:11.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:01:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:12 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c003cc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:12 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:01:12 np0005479823 python3.9[195781]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:01:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:12 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:12 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:12 np0005479823 python3.9[195860]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:01:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:13.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:13.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:13 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:01:13 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:01:13 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:01:13 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:01:13 np0005479823 python3.9[196094]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 06:01:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:14 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:14 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c003cc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:14 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:14 np0005479823 podman[196245]: 2025-10-10 10:01:14.698893088 +0000 UTC m=+0.048922408 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 10 06:01:14 np0005479823 python3[196292]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 10 06:01:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:01:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:15.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:01:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:01:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:15 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:01:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:15 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:01:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:15 np0005479823 python3.9[196446]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:01:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:15.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:15 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:01:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:15 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:01:16 np0005479823 python3.9[196524]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:01:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:16 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:16 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:16 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d08000fa0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:17.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:17 np0005479823 python3.9[196681]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:01:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:17 np0005479823 python3.9[196759]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:01:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:17.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:18 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:18 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:18 np0005479823 python3.9[196911]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:01:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:18 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:18 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 06:01:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:01:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:19.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:01:19 np0005479823 python3.9[196992]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:01:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:19.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:19 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:01:19 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:01:19 np0005479823 python3.9[197168]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:01:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:01:20 np0005479823 python3.9[197246]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:01:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:20 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d08001aa0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:20 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf80016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:20 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:01:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:21.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:01:21 np0005479823 python3.9[197400]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:01:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:01:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:21.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:01:21 np0005479823 python3.9[197525]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1760090480.780365-3902-157953605388440/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:01:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:21 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:01:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:22 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:22 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d08001aa0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:22 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf80016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:22 np0005479823 python3.9[197678]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:01:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:01:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:23.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:01:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:23 np0005479823 python3.9[197831]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 06:01:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:23.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.788446) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090483788495, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 658, "num_deletes": 252, "total_data_size": 1234293, "memory_usage": 1252024, "flush_reason": "Manual Compaction"}
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090483795751, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 571764, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18021, "largest_seqno": 18674, "table_properties": {"data_size": 568864, "index_size": 872, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7595, "raw_average_key_size": 19, "raw_value_size": 562820, "raw_average_value_size": 1481, "num_data_blocks": 38, "num_entries": 380, "num_filter_entries": 380, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760090442, "oldest_key_time": 1760090442, "file_creation_time": 1760090483, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 7426 microseconds, and 2627 cpu microseconds.
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.795878) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 571764 bytes OK
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.795899) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.797328) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.797343) EVENT_LOG_v1 {"time_micros": 1760090483797338, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.797361) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 1230683, prev total WAL file size 1230683, number of live WAL files 2.
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.797994) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323531' seq:72057594037927935, type:22 .. '6D67727374617400353034' seq:0, type:0; will stop at (end)
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(558KB)], [30(14MB)]
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090483798075, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 16085813, "oldest_snapshot_seqno": -1}
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4950 keys, 12217035 bytes, temperature: kUnknown
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090483866320, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 12217035, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12183408, "index_size": 20141, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12421, "raw_key_size": 124728, "raw_average_key_size": 25, "raw_value_size": 12093018, "raw_average_value_size": 2443, "num_data_blocks": 840, "num_entries": 4950, "num_filter_entries": 4950, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760090483, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.866680) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 12217035 bytes
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.868509) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 235.3 rd, 178.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 14.8 +0.0 blob) out(11.7 +0.0 blob), read-write-amplify(49.5) write-amplify(21.4) OK, records in: 5452, records dropped: 502 output_compression: NoCompression
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.868549) EVENT_LOG_v1 {"time_micros": 1760090483868535, "job": 16, "event": "compaction_finished", "compaction_time_micros": 68375, "compaction_time_cpu_micros": 25086, "output_level": 6, "num_output_files": 1, "total_output_size": 12217035, "num_input_records": 5452, "num_output_records": 4950, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090483868821, "job": 16, "event": "table_file_deletion", "file_number": 32}
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090483871760, "job": 16, "event": "table_file_deletion", "file_number": 30}
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.797873) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.871938) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.871943) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.871945) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.871946) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:01:23 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:01:23.871948) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:01:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100124 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 06:01:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:24 np0005479823 python3.9[197986]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:01:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d080027b0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:01:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:01:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:25.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:01:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:25 np0005479823 python3.9[198140]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 06:01:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:25.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf80016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:26 np0005479823 python3.9[198293]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 06:01:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:27.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:27 np0005479823 python3.9[198449]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 06:01:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:27.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:27 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 06:01:28 np0005479823 python3.9[198604]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:01:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:28 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d08002930 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:28 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:28 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:29 np0005479823 python3.9[198758]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:01:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:29.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:29 np0005479823 python3.9[198881]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090488.543982-4117-275162118658175/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:01:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:29.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:01:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:30 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:30 np0005479823 python3.9[199033]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:01:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:30 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d08003250 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:30 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:31 np0005479823 python3.9[199158]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090490.080768-4163-241776073727949/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:01:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:01:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:31.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:01:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100131 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 06:01:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:01:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:31.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:01:32 np0005479823 python3.9[199310]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:01:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:32 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:32 np0005479823 python3.9[199433]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090491.5657253-4208-253424999067681/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:01:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:32 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:32 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d08003250 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:01:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:33.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:01:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:33.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:33 np0005479823 python3.9[199587]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 06:01:33 np0005479823 systemd[1]: Reloading.
Oct 10 06:01:33 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 06:01:33 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 06:01:34 np0005479823 systemd[1]: Reached target edpm_libvirt.target.
Oct 10 06:01:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100134 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 06:01:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:34 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:34 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:34 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:34 np0005479823 python3.9[199804]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct 10 06:01:35 np0005479823 systemd[1]: Reloading.
Oct 10 06:01:35 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 06:01:35 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 06:01:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:01:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:35.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:01:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:01:35 np0005479823 systemd[1]: Reloading.
Oct 10 06:01:35 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 06:01:35 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 06:01:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:35.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:36 np0005479823 systemd[1]: session-53.scope: Deactivated successfully.
Oct 10 06:01:36 np0005479823 systemd[1]: session-53.scope: Consumed 3min 20.626s CPU time.
Oct 10 06:01:36 np0005479823 systemd-logind[796]: Session 53 logged out. Waiting for processes to exit.
Oct 10 06:01:36 np0005479823 systemd-logind[796]: Removed session 53.
Oct 10 06:01:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:36 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d08003250 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:36 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:36 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:37.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:37.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:37 np0005479823 podman[199905]: 2025-10-10 10:01:37.818555468 +0000 UTC m=+0.096757100 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller)
Oct 10 06:01:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:38 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:38 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d080042f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:38 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:39.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:01:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:39.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:01:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:01:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:40 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:40 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:40 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:41.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:01:41.450 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:01:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:01:41.451 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:01:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:01:41.451 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:01:41 np0005479823 systemd-logind[796]: New session 54 of user zuul.
Oct 10 06:01:41 np0005479823 systemd[1]: Started Session 54 of User zuul.
Oct 10 06:01:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:41.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:42 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:42 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:42 np0005479823 python3.9[200089]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 06:01:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:42 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:42 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:01:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:43.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:43.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:44 np0005479823 python3.9[200247]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:01:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:44 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:44 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:44 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:44 np0005479823 python3.9[200400]: ansible-ansible.builtin.file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:01:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:45.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:01:45 np0005479823 podman[200525]: 2025-10-10 10:01:45.271889112 +0000 UTC m=+0.049501196 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 10 06:01:45 np0005479823 python3.9[200570]: ansible-ansible.builtin.file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:01:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:45 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:01:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:45 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:01:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:45.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:46 np0005479823 python3.9[200722]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 10 06:01:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:46 np0005479823 python3.9[200874]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data/ansible-generated/iscsid setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:01:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:47.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:47.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:48 np0005479823 python3.9[201028]: ansible-ansible.builtin.stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 06:01:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 06:01:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:49.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:49 np0005479823 python3.9[201186]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsid.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 06:01:49 np0005479823 systemd[1]: Reloading.
Oct 10 06:01:49 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 06:01:49 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 06:01:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:49.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:01:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:50 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:50 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:50 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:50 np0005479823 python3.9[201375]: ansible-ansible.builtin.service_facts Invoked
Oct 10 06:01:50 np0005479823 network[201393]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 10 06:01:50 np0005479823 network[201394]: 'network-scripts' will be removed from distribution in near future.
Oct 10 06:01:50 np0005479823 network[201395]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 10 06:01:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:51.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:01:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:51.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:01:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:52 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:52 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003c30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:52 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c0013c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:53.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:53.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100154 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 06:01:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:54 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:54 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:54 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003dd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:55.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:01:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:01:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:55.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:01:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:56 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c002230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:56 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:56 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:57 np0005479823 python3.9[201699]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsi-starter.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 06:01:57 np0005479823 systemd[1]: Reloading.
Oct 10 06:01:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:57.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:57 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 06:01:57 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 06:01:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:01:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:57.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:01:58 np0005479823 python3.9[201887]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 06:01:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:58 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003dd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:58 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c002230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:01:58 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:01:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:01:59.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:01:59 np0005479823 python3.9[202041]: ansible-containers.podman.podman_container Invoked with command=/usr/sbin/iscsi-iname detach=False image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f name=iscsid_config rm=True tty=True executable=podman state=started debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct 10 06:01:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:01:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:01:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:01:59 np0005479823 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 06:01:59 np0005479823 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 06:01:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:01:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:01:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:01:59.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:02:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:00 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc002ea0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:00 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003dd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:00 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c002f40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:00 np0005479823 podman[202053]: 2025-10-10 10:02:00.825899874 +0000 UTC m=+1.355243996 image pull 74877095db294c27659f24e7f86074178a6f28eee68561c30e3ce4d18519e09c quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f
Oct 10 06:02:00 np0005479823 podman[202114]: 2025-10-10 10:02:00.957548844 +0000 UTC m=+0.042965288 container create 8d7a27a2e27d53ab1bc6cb02d1f315f751cd1f1b82bdf17a788f2fbbff8bd629 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid_config, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 06:02:00 np0005479823 NetworkManager[44866]: <info>  [1760090520.9787] manager: (podman0): new Bridge device (/org/freedesktop/NetworkManager/Devices/23)
Oct 10 06:02:01 np0005479823 kernel: podman0: port 1(veth0) entered blocking state
Oct 10 06:02:01 np0005479823 kernel: podman0: port 1(veth0) entered disabled state
Oct 10 06:02:01 np0005479823 kernel: veth0: entered allmulticast mode
Oct 10 06:02:01 np0005479823 kernel: veth0: entered promiscuous mode
Oct 10 06:02:01 np0005479823 kernel: podman0: port 1(veth0) entered blocking state
Oct 10 06:02:01 np0005479823 kernel: podman0: port 1(veth0) entered forwarding state
Oct 10 06:02:01 np0005479823 systemd-udevd[202134]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 06:02:01 np0005479823 systemd-udevd[202132]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 06:02:01 np0005479823 NetworkManager[44866]: <info>  [1760090521.0114] manager: (veth0): new Veth device (/org/freedesktop/NetworkManager/Devices/24)
Oct 10 06:02:01 np0005479823 NetworkManager[44866]: <info>  [1760090521.0133] device (veth0): carrier: link connected
Oct 10 06:02:01 np0005479823 NetworkManager[44866]: <info>  [1760090521.0137] device (podman0): carrier: link connected
Oct 10 06:02:01 np0005479823 NetworkManager[44866]: <info>  [1760090521.0243] device (podman0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 10 06:02:01 np0005479823 NetworkManager[44866]: <info>  [1760090521.0262] device (podman0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 10 06:02:01 np0005479823 NetworkManager[44866]: <info>  [1760090521.0273] device (podman0): Activation: starting connection 'podman0' (35ed1ff3-e2d7-4b9b-a59b-c3bf7706578c)
Oct 10 06:02:01 np0005479823 NetworkManager[44866]: <info>  [1760090521.0275] device (podman0): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 10 06:02:01 np0005479823 NetworkManager[44866]: <info>  [1760090521.0280] device (podman0): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 10 06:02:01 np0005479823 NetworkManager[44866]: <info>  [1760090521.0284] device (podman0): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 10 06:02:01 np0005479823 NetworkManager[44866]: <info>  [1760090521.0287] device (podman0): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 10 06:02:01 np0005479823 podman[202114]: 2025-10-10 10:02:00.935623716 +0000 UTC m=+0.021040170 image pull 74877095db294c27659f24e7f86074178a6f28eee68561c30e3ce4d18519e09c quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f
Oct 10 06:02:01 np0005479823 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 10 06:02:01 np0005479823 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 10 06:02:01 np0005479823 NetworkManager[44866]: <info>  [1760090521.0534] device (podman0): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 10 06:02:01 np0005479823 NetworkManager[44866]: <info>  [1760090521.0537] device (podman0): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 10 06:02:01 np0005479823 NetworkManager[44866]: <info>  [1760090521.0547] device (podman0): Activation: successful, device activated.
Oct 10 06:02:01 np0005479823 systemd[1]: iscsi.service: Unit cannot be reloaded because it is inactive.
Oct 10 06:02:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:01.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:01 np0005479823 systemd[1]: Started libpod-conmon-8d7a27a2e27d53ab1bc6cb02d1f315f751cd1f1b82bdf17a788f2fbbff8bd629.scope.
Oct 10 06:02:01 np0005479823 systemd[1]: Started libcrun container.
Oct 10 06:02:01 np0005479823 podman[202114]: 2025-10-10 10:02:01.28417996 +0000 UTC m=+0.369596424 container init 8d7a27a2e27d53ab1bc6cb02d1f315f751cd1f1b82bdf17a788f2fbbff8bd629 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid_config, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct 10 06:02:01 np0005479823 podman[202114]: 2025-10-10 10:02:01.290518392 +0000 UTC m=+0.375934836 container start 8d7a27a2e27d53ab1bc6cb02d1f315f751cd1f1b82bdf17a788f2fbbff8bd629 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid_config, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct 10 06:02:01 np0005479823 podman[202114]: 2025-10-10 10:02:01.293954671 +0000 UTC m=+0.379371115 container attach 8d7a27a2e27d53ab1bc6cb02d1f315f751cd1f1b82bdf17a788f2fbbff8bd629 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid_config, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct 10 06:02:01 np0005479823 iscsid_config[202271]: iqn.1994-05.com.redhat:d6e1178f5fe2#015
Oct 10 06:02:01 np0005479823 systemd[1]: libpod-8d7a27a2e27d53ab1bc6cb02d1f315f751cd1f1b82bdf17a788f2fbbff8bd629.scope: Deactivated successfully.
Oct 10 06:02:01 np0005479823 podman[202114]: 2025-10-10 10:02:01.295873362 +0000 UTC m=+0.381289816 container died 8d7a27a2e27d53ab1bc6cb02d1f315f751cd1f1b82bdf17a788f2fbbff8bd629 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid_config, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 10 06:02:01 np0005479823 kernel: podman0: port 1(veth0) entered disabled state
Oct 10 06:02:01 np0005479823 kernel: veth0 (unregistering): left allmulticast mode
Oct 10 06:02:01 np0005479823 kernel: veth0 (unregistering): left promiscuous mode
Oct 10 06:02:01 np0005479823 kernel: podman0: port 1(veth0) entered disabled state
Oct 10 06:02:01 np0005479823 NetworkManager[44866]: <info>  [1760090521.3460] device (podman0): state change: activated -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 10 06:02:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:01 np0005479823 systemd[1]: run-netns-netns\x2de4731ffb\x2d6cfe\x2dede5\x2d6505\x2d7fa8772e9eb3.mount: Deactivated successfully.
Oct 10 06:02:01 np0005479823 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8d7a27a2e27d53ab1bc6cb02d1f315f751cd1f1b82bdf17a788f2fbbff8bd629-userdata-shm.mount: Deactivated successfully.
Oct 10 06:02:01 np0005479823 systemd[1]: var-lib-containers-storage-overlay-555c874926332621b9a9c5aa9a07878d9118ea4b416f93c8b49210487c5856c2-merged.mount: Deactivated successfully.
Oct 10 06:02:01 np0005479823 podman[202114]: 2025-10-10 10:02:01.660190268 +0000 UTC m=+0.745606712 container remove 8d7a27a2e27d53ab1bc6cb02d1f315f751cd1f1b82bdf17a788f2fbbff8bd629 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid_config, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 10 06:02:01 np0005479823 python3.9[202041]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman run --name iscsid_config --detach=False --rm --tty=True quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f /usr/sbin/iscsi-iname
Oct 10 06:02:01 np0005479823 systemd[1]: libpod-conmon-8d7a27a2e27d53ab1bc6cb02d1f315f751cd1f1b82bdf17a788f2fbbff8bd629.scope: Deactivated successfully.
Oct 10 06:02:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:02:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:01.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:02:01 np0005479823 python3.9[202041]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: Error generating systemd: #012DEPRECATED command:#012It is recommended to use Quadlets for running containers and pods under systemd.#012#012Please refer to podman-systemd.unit(5) for details.#012Error: iscsid_config does not refer to a container or pod: no pod with name or ID iscsid_config found: no such pod: no container with name or ID "iscsid_config" found: no such container
Oct 10 06:02:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:02 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:02 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc002ea0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:02 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003dd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:03 np0005479823 python3.9[202515]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:02:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:03.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:03.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:03 np0005479823 python3.9[202638]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090522.5644763-319-93793982205751/.source.iscsi _original_basename=.vcm9vftn follow=False checksum=9695408341163c1bdea87fa513eba8362730e33b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:02:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:04 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c002f40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:04 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:04 np0005479823 python3.9[202790]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:02:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:04 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc002ea0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:05.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:02:05 np0005479823 python3.9[202942]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/iscsid.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 06:02:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:02:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:05.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:02:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:06 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc002ea0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:06 np0005479823 python3.9[203096]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:02:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:06 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c002f40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:06 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:07.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:07 np0005479823 python3.9[203250]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:02:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:07.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:08 np0005479823 podman[203374]: 2025-10-10 10:02:08.208214368 +0000 UTC m=+0.100544421 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 06:02:08 np0005479823 python3.9[203425]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:02:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:08 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc002ea0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:08 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003dd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:08 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004040 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:08 np0005479823 python3.9[203507]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:02:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:02:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:09.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:02:09 np0005479823 python3.9[203660]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:02:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:02:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:09.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:02:10 np0005479823 python3.9[203738]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:02:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:02:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:10 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:10 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc004200 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:10 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003df0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:10 np0005479823 python3.9[203891]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:02:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:02:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:11.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:02:11 np0005479823 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 10 06:02:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:11.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:11 np0005479823 python3.9[204044]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:02:12 np0005479823 python3.9[204122]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:02:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:12 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004040 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:12 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:12 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc004200 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000064s ======
Oct 10 06:02:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:13.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Oct 10 06:02:13 np0005479823 python3.9[204276]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:02:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:13 np0005479823 python3.9[204354]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:02:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:02:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:13.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:02:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:14 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003e10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:14 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004040 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:14 np0005479823 python3.9[204506]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 06:02:14 np0005479823 systemd[1]: Reloading.
Oct 10 06:02:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:14 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:14 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 06:02:14 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 06:02:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:02:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:02:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:15.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:02:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:15.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:15 np0005479823 podman[204695]: 2025-10-10 10:02:15.781737537 +0000 UTC m=+0.056251841 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 10 06:02:15 np0005479823 python3.9[204740]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:02:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100216 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 06:02:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:16 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:16 np0005479823 python3.9[204820]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:02:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:16 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:16 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:17.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:17 np0005479823 python3.9[204974]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:02:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:17 np0005479823 python3.9[205052]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:02:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:02:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:17.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:02:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:18 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003e50 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:18 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc004200 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:18 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc004200 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:18 np0005479823 python3.9[205204]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 06:02:18 np0005479823 systemd[1]: Reloading.
Oct 10 06:02:18 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 06:02:18 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 06:02:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:19.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:19 np0005479823 systemd[1]: Starting Create netns directory...
Oct 10 06:02:19 np0005479823 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 10 06:02:19 np0005479823 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 10 06:02:19 np0005479823 systemd[1]: Finished Create netns directory.
Oct 10 06:02:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:02:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:19.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:02:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:02:20 np0005479823 python3.9[205479]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:02:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:20 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:20 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003e50 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:20 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc004200 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:20 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:02:20 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:02:20 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:02:20 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:02:21 np0005479823 python3.9[205633]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/iscsid/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:02:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:21.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:21 np0005479823 python3.9[205756]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/iscsid/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090540.5293305-782-201508999599134/.source _original_basename=healthcheck follow=False checksum=2e1237e7fe015c809b173c52e24cfb87132f4344 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:02:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:02:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:21.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:02:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:22 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:22 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:22 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003e70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:23 np0005479823 python3.9[205910]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:02:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:23.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:23 np0005479823 python3.9[206062]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/iscsid.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:02:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:02:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:23.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:02:24 np0005479823 python3.9[206185]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/iscsid.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090543.2935624-856-230394218249425/.source.json _original_basename=.55bfguvo follow=False checksum=80e4f97460718c7e5c66b21ef8b846eba0e0dbc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:02:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc004200 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:02:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:02:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:25.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:25 np0005479823 python3.9[206339]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/iscsid state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:02:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:25.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:26 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:02:26 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:02:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003e90 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc004200 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:02:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:27.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:02:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:27 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:02:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:27 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:02:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:27.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:28 np0005479823 python3.9[206793]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/iscsid config_pattern=*.json debug=False
Oct 10 06:02:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:28 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:28 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:28 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc004200 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:29 np0005479823 python3.9[206947]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 10 06:02:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:02:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:29.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:02:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:29.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:02:30 np0005479823 python3.9[207099]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 10 06:02:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:30 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:30 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:30 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 06:02:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:30 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:31.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:31.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:32 np0005479823 python3[207280]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/iscsid config_id=iscsid config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 10 06:02:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:32 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:32 np0005479823 podman[207313]: 2025-10-10 10:02:32.497065971 +0000 UTC m=+0.023537860 image pull 74877095db294c27659f24e7f86074178a6f28eee68561c30e3ce4d18519e09c quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f
Oct 10 06:02:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:32 np0005479823 podman[207313]: 2025-10-10 10:02:32.650452394 +0000 UTC m=+0.176924263 container create e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, config_id=iscsid, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 10 06:02:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:32 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003ed0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:32 np0005479823 python3[207280]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name iscsid --conmon-pidfile /run/iscsid.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=iscsid --label container_name=iscsid --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:z --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/openstack/healthchecks/iscsid:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f
Oct 10 06:02:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:32 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:33.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:33 np0005479823 python3.9[207505]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 06:02:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:02:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:33.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:02:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:34 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc004200 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:34 np0005479823 python3.9[207659]: ansible-file Invoked with path=/etc/systemd/system/edpm_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:02:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:34 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:34 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003ef0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:35 np0005479823 python3.9[207762]: ansible-stat Invoked with path=/etc/systemd/system/edpm_iscsid_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 06:02:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:02:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:35.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:35 np0005479823 python3.9[207913]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760090555.1040416-1120-272165618370551/source dest=/etc/systemd/system/edpm_iscsid.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:02:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:02:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:35.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:02:36 np0005479823 python3.9[207989]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 10 06:02:36 np0005479823 systemd[1]: Reloading.
Oct 10 06:02:36 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 06:02:36 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 06:02:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100236 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 06:02:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:36 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:36 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc004200 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:36 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140036d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:37 np0005479823 python3.9[208103]: ansible-systemd Invoked with state=restarted name=edpm_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 06:02:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:37.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:37 np0005479823 systemd[1]: Reloading.
Oct 10 06:02:37 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 06:02:37 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 06:02:37 np0005479823 systemd[1]: Starting iscsid container...
Oct 10 06:02:37 np0005479823 systemd[1]: Started libcrun container.
Oct 10 06:02:37 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3aac090083d025eb07f24ae40c0fec22943516518b8e7fdebcf9328461a5c6a/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Oct 10 06:02:37 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3aac090083d025eb07f24ae40c0fec22943516518b8e7fdebcf9328461a5c6a/merged/etc/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 10 06:02:37 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3aac090083d025eb07f24ae40c0fec22943516518b8e7fdebcf9328461a5c6a/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 10 06:02:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:37 np0005479823 systemd[1]: Started /usr/bin/podman healthcheck run e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0.
Oct 10 06:02:37 np0005479823 podman[208142]: 2025-10-10 10:02:37.627090159 +0000 UTC m=+0.108997960 container init e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct 10 06:02:37 np0005479823 iscsid[208157]: + sudo -E kolla_set_configs
Oct 10 06:02:37 np0005479823 podman[208142]: 2025-10-10 10:02:37.665587274 +0000 UTC m=+0.147495065 container start e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 10 06:02:37 np0005479823 podman[208142]: iscsid
Oct 10 06:02:37 np0005479823 systemd[1]: Started iscsid container.
Oct 10 06:02:37 np0005479823 systemd[1]: Created slice User Slice of UID 0.
Oct 10 06:02:37 np0005479823 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct 10 06:02:37 np0005479823 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct 10 06:02:37 np0005479823 systemd[1]: Starting User Manager for UID 0...
Oct 10 06:02:37 np0005479823 podman[208164]: 2025-10-10 10:02:37.770914027 +0000 UTC m=+0.096155561 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 10 06:02:37 np0005479823 systemd[1]: e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0-1fe2500b8305d33.service: Main process exited, code=exited, status=1/FAILURE
Oct 10 06:02:37 np0005479823 systemd[1]: e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0-1fe2500b8305d33.service: Failed with result 'exit-code'.
Oct 10 06:02:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:02:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:37.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:02:37 np0005479823 systemd[208177]: Queued start job for default target Main User Target.
Oct 10 06:02:37 np0005479823 systemd[208177]: Created slice User Application Slice.
Oct 10 06:02:37 np0005479823 systemd[208177]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct 10 06:02:37 np0005479823 systemd[208177]: Started Daily Cleanup of User's Temporary Directories.
Oct 10 06:02:37 np0005479823 systemd[208177]: Reached target Paths.
Oct 10 06:02:37 np0005479823 systemd[208177]: Reached target Timers.
Oct 10 06:02:37 np0005479823 systemd[208177]: Starting D-Bus User Message Bus Socket...
Oct 10 06:02:37 np0005479823 systemd[208177]: Starting Create User's Volatile Files and Directories...
Oct 10 06:02:37 np0005479823 systemd[208177]: Finished Create User's Volatile Files and Directories.
Oct 10 06:02:37 np0005479823 systemd[208177]: Listening on D-Bus User Message Bus Socket.
Oct 10 06:02:37 np0005479823 systemd[208177]: Reached target Sockets.
Oct 10 06:02:37 np0005479823 systemd[208177]: Reached target Basic System.
Oct 10 06:02:37 np0005479823 systemd[208177]: Reached target Main User Target.
Oct 10 06:02:37 np0005479823 systemd[208177]: Startup finished in 136ms.
Oct 10 06:02:37 np0005479823 systemd[1]: Started User Manager for UID 0.
Oct 10 06:02:37 np0005479823 systemd[1]: Started Session c3 of User root.
Oct 10 06:02:37 np0005479823 iscsid[208157]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 10 06:02:37 np0005479823 iscsid[208157]: INFO:__main__:Validating config file
Oct 10 06:02:37 np0005479823 iscsid[208157]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 10 06:02:37 np0005479823 iscsid[208157]: INFO:__main__:Writing out command to execute
Oct 10 06:02:37 np0005479823 systemd[1]: session-c3.scope: Deactivated successfully.
Oct 10 06:02:37 np0005479823 iscsid[208157]: ++ cat /run_command
Oct 10 06:02:37 np0005479823 iscsid[208157]: + CMD='/usr/sbin/iscsid -f'
Oct 10 06:02:37 np0005479823 iscsid[208157]: + ARGS=
Oct 10 06:02:37 np0005479823 iscsid[208157]: + sudo kolla_copy_cacerts
Oct 10 06:02:37 np0005479823 systemd[1]: Started Session c4 of User root.
Oct 10 06:02:37 np0005479823 iscsid[208157]: + [[ ! -n '' ]]
Oct 10 06:02:37 np0005479823 systemd[1]: session-c4.scope: Deactivated successfully.
Oct 10 06:02:37 np0005479823 iscsid[208157]: + . kolla_extend_start
Oct 10 06:02:37 np0005479823 iscsid[208157]: ++ [[ ! -f /etc/iscsi/initiatorname.iscsi ]]
Oct 10 06:02:37 np0005479823 iscsid[208157]: Running command: '/usr/sbin/iscsid -f'
Oct 10 06:02:37 np0005479823 iscsid[208157]: + echo 'Running command: '\''/usr/sbin/iscsid -f'\'''
Oct 10 06:02:37 np0005479823 iscsid[208157]: + umask 0022
Oct 10 06:02:37 np0005479823 iscsid[208157]: + exec /usr/sbin/iscsid -f
Oct 10 06:02:38 np0005479823 kernel: Loading iSCSI transport class v2.0-870.
Oct 10 06:02:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:38 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003f10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:38 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003dd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:38 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003dd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:38 np0005479823 podman[208335]: 2025-10-10 10:02:38.819565363 +0000 UTC m=+0.091346578 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller)
Oct 10 06:02:38 np0005479823 python3.9[208385]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.iscsid_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 06:02:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:02:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:39.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:02:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:39.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:39 np0005479823 python3.9[208544]: ansible-ansible.builtin.file Invoked with path=/etc/iscsi/.iscsid_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:02:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:02:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:40 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140047d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:40 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003f30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:40 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003dd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:41 np0005479823 python3.9[208697]: ansible-ansible.builtin.service_facts Invoked
Oct 10 06:02:41 np0005479823 network[208715]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 10 06:02:41 np0005479823 network[208716]: 'network-scripts' will be removed from distribution in near future.
Oct 10 06:02:41 np0005479823 network[208717]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 10 06:02:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:41.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:02:41.452 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:02:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:02:41.453 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:02:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:02:41.453 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:02:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:41.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:42 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc004200 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:42 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140047d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:42 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003f50 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:02:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:43.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:02:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:43.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:44 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003df0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:44 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003df0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:44 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140047d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:02:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:45.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:45.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003f70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8003df0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:46 np0005479823 podman[208970]: 2025-10-10 10:02:46.740102227 +0000 UTC m=+0.057779400 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 10 06:02:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140047d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:46 np0005479823 python3.9[209018]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 10 06:02:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:47.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:47.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:47 np0005479823 python3.9[209171]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Oct 10 06:02:48 np0005479823 systemd[1]: Stopping User Manager for UID 0...
Oct 10 06:02:48 np0005479823 systemd[208177]: Activating special unit Exit the Session...
Oct 10 06:02:48 np0005479823 systemd[208177]: Stopped target Main User Target.
Oct 10 06:02:48 np0005479823 systemd[208177]: Stopped target Basic System.
Oct 10 06:02:48 np0005479823 systemd[208177]: Stopped target Paths.
Oct 10 06:02:48 np0005479823 systemd[208177]: Stopped target Sockets.
Oct 10 06:02:48 np0005479823 systemd[208177]: Stopped target Timers.
Oct 10 06:02:48 np0005479823 systemd[208177]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 10 06:02:48 np0005479823 systemd[208177]: Closed D-Bus User Message Bus Socket.
Oct 10 06:02:48 np0005479823 systemd[208177]: Stopped Create User's Volatile Files and Directories.
Oct 10 06:02:48 np0005479823 systemd[208177]: Removed slice User Application Slice.
Oct 10 06:02:48 np0005479823 systemd[208177]: Reached target Shutdown.
Oct 10 06:02:48 np0005479823 systemd[208177]: Finished Exit the Session.
Oct 10 06:02:48 np0005479823 systemd[208177]: Reached target Exit the Session.
Oct 10 06:02:48 np0005479823 systemd[1]: user@0.service: Deactivated successfully.
Oct 10 06:02:48 np0005479823 systemd[1]: Stopped User Manager for UID 0.
Oct 10 06:02:48 np0005479823 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct 10 06:02:48 np0005479823 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct 10 06:02:48 np0005479823 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct 10 06:02:48 np0005479823 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct 10 06:02:48 np0005479823 systemd[1]: Removed slice User Slice of UID 0.
Oct 10 06:02:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf0000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003f90 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:48 np0005479823 python3.9[209328]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:02:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:02:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:49.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:02:49 np0005479823 python3.9[209454]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090568.1586635-1343-220083241216947/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:02:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:49.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:02:50 np0005479823 python3.9[209606]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:02:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:50 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140047d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:50 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:50 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003fb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:51 np0005479823 python3.9[209760]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 06:02:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:02:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:51.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:02:51 np0005479823 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct 10 06:02:51 np0005479823 systemd[1]: Stopped Load Kernel Modules.
Oct 10 06:02:51 np0005479823 systemd[1]: Stopping Load Kernel Modules...
Oct 10 06:02:51 np0005479823 systemd[1]: Starting Load Kernel Modules...
Oct 10 06:02:51 np0005479823 systemd[1]: Finished Load Kernel Modules.
Oct 10 06:02:51 np0005479823 systemd[1]: virtnodedevd.service: Deactivated successfully.
Oct 10 06:02:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:51.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:52 np0005479823 python3.9[209917]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:02:52 np0005479823 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 10 06:02:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:52 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:52 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140047d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:52 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:53 np0005479823 python3.9[210072]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 06:02:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:53.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:53.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:54 np0005479823 python3.9[210224]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 06:02:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:54 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003fd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:54 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:54 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:54 np0005479823 python3.9[210377]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:02:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:02:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:55.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:55 np0005479823 python3.9[210526]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090574.301278-1516-124561907368109/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:02:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:55.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:56 np0005479823 python3.9[210678]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 06:02:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:56 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d140047d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:56 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04003ff0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:56 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:57 np0005479823 python3.9[210833]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:02:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:57.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:02:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:02:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:57.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:02:58 np0005479823 python3.9[210985]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:02:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:58 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:58 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:02:58 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004010 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:02:58 np0005479823 python3.9[211138]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:02:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.003000096s ======
Oct 10 06:02:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:02:59.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000096s
Oct 10 06:02:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:02:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:02:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:02:59 np0005479823 python3.9[211291]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:02:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:02:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:02:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:02:59.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:03:00 np0005479823 python3.9[211443]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:03:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:00 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:00 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:00 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:00 np0005479823 python3.9[211596]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:03:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:01.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:01 np0005479823 python3.9[211749]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:03:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:01.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:02 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004030 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:02 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c001c60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:02 np0005479823 python3.9[211902]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 06:03:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:02 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:03.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:03 np0005479823 python3.9[212057]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:03:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:03.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:04 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:04 np0005479823 python3.9[212209]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:03:04 np0005479823 systemd[1]: virtsecretd.service: Deactivated successfully.
Oct 10 06:03:04 np0005479823 systemd[1]: virtqemud.service: Deactivated successfully.
Oct 10 06:03:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:04 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004050 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:04 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:03:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:03:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:05.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:03:05 np0005479823 python3.9[212365]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:03:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:05 np0005479823 python3.9[212443]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:03:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:03:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:05.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:03:06 np0005479823 python3.9[212595]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:03:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:06 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:06 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:06 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004070 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:06 np0005479823 python3.9[212674]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:03:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:07.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:07.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:07 np0005479823 python3.9[212827]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:03:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:08 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d14004b30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:08 np0005479823 podman[212951]: 2025-10-10 10:03:08.555649342 +0000 UTC m=+0.055516564 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 06:03:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:08 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d14004b30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:08 np0005479823 python3.9[212997]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:03:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:08 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:08 np0005479823 podman[213051]: 2025-10-10 10:03:08.990745599 +0000 UTC m=+0.068122357 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 06:03:09 np0005479823 python3.9[213099]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:03:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:09.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100309 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 06:03:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:09.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:10 np0005479823 python3.9[213257]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:03:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:03:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:10 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:10 np0005479823 python3.9[213335]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:03:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:10 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf0003700 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:10 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d14004b50 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:11.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:11 np0005479823 python3.9[213489]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 06:03:11 np0005479823 systemd[1]: Reloading.
Oct 10 06:03:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:11 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 06:03:11 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 06:03:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:03:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:11.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:03:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:12 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:12 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040040b0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:12 np0005479823 python3.9[213679]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:03:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:12 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf0003700 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:13 np0005479823 python3.9[213758]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:03:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:13.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.831581) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090593831641, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 1233, "num_deletes": 254, "total_data_size": 2977320, "memory_usage": 3016544, "flush_reason": "Manual Compaction"}
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Oct 10 06:03:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:03:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:13.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090593859866, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 1967335, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18679, "largest_seqno": 19907, "table_properties": {"data_size": 1962011, "index_size": 2784, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 10656, "raw_average_key_size": 18, "raw_value_size": 1951470, "raw_average_value_size": 3387, "num_data_blocks": 125, "num_entries": 576, "num_filter_entries": 576, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760090484, "oldest_key_time": 1760090484, "file_creation_time": 1760090593, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 28376 microseconds, and 5037 cpu microseconds.
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.859953) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 1967335 bytes OK
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.859987) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.862945) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.862984) EVENT_LOG_v1 {"time_micros": 1760090593862974, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.863018) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 2971483, prev total WAL file size 2971747, number of live WAL files 2.
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.864222) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323531' seq:0, type:0; will stop at (end)
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(1921KB)], [33(11MB)]
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090593864308, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 14184370, "oldest_snapshot_seqno": -1}
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 5004 keys, 13702762 bytes, temperature: kUnknown
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090593965892, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 13702762, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13667797, "index_size": 21351, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12549, "raw_key_size": 126959, "raw_average_key_size": 25, "raw_value_size": 13575589, "raw_average_value_size": 2712, "num_data_blocks": 878, "num_entries": 5004, "num_filter_entries": 5004, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760090593, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.966355) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 13702762 bytes
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.970814) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 139.2 rd, 134.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 11.7 +0.0 blob) out(13.1 +0.0 blob), read-write-amplify(14.2) write-amplify(7.0) OK, records in: 5526, records dropped: 522 output_compression: NoCompression
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.970848) EVENT_LOG_v1 {"time_micros": 1760090593970841, "job": 18, "event": "compaction_finished", "compaction_time_micros": 101863, "compaction_time_cpu_micros": 26386, "output_level": 6, "num_output_files": 1, "total_output_size": 13702762, "num_input_records": 5526, "num_output_records": 5004, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090593971205, "job": 18, "event": "table_file_deletion", "file_number": 35}
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090593973249, "job": 18, "event": "table_file_deletion", "file_number": 33}
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.864048) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.973324) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.973329) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.973331) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.973332) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:03:13 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:03:13.973333) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:03:14 np0005479823 python3.9[213910]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:03:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:14 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d14004b70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:14 np0005479823 python3.9[213988]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:03:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:14 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:14 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040040d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:03:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:15.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:15 np0005479823 python3.9[214167]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 06:03:15 np0005479823 systemd[1]: Reloading.
Oct 10 06:03:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:15 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 06:03:15 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 06:03:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:15.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:15 np0005479823 systemd[1]: Starting Create netns directory...
Oct 10 06:03:15 np0005479823 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 10 06:03:15 np0005479823 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 10 06:03:15 np0005479823 systemd[1]: Finished Create netns directory.
Oct 10 06:03:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:16 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf0003700 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:16 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d14004b90 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:16 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:16 np0005479823 podman[214333]: 2025-10-10 10:03:16.878018299 +0000 UTC m=+0.066782695 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Oct 10 06:03:17 np0005479823 python3.9[214381]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:03:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:17.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:03:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:17.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:03:17 np0005479823 python3.9[214533]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:03:18 np0005479823 python3.9[214656]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090597.4045062-2137-14079851584351/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:03:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:18 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040040f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:18 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:03:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:18 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040040f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:18 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc002630 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:19.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:19 np0005479823 python3.9[214811]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:03:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:19.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:03:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:20 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:20 np0005479823 python3.9[214963]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:03:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:20 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040040f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:20 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:21 np0005479823 python3.9[215089]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090600.1159778-2212-347685999192/.source.json _original_basename=.gg2mztjk follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:03:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:03:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:21.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:03:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:21 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:03:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:21 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:03:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:03:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:21.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:03:22 np0005479823 python3.9[215241]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:03:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:22 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc002630 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:22 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:22 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040040f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:23.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:23.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 06:03:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc002630 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:24 np0005479823 python3.9[215670]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Oct 10 06:03:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:24 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:03:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:03:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:25.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:03:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:25 np0005479823 python3.9[215828]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 10 06:03:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:25.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040040f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100326 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 06:03:26 np0005479823 python3.9[216056]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 10 06:03:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:26 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:03:26 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:03:26 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:03:26 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:03:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:26 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001a70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:27.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:27.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:28 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:28 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:28 np0005479823 python3[216237]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 10 06:03:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:28 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:29.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100329 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 06:03:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:29.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:29 np0005479823 podman[216251]: 2025-10-10 10:03:29.945721547 +0000 UTC m=+1.167097737 image pull f541ff382622bd8bc9ad206129d2a8e74c239ff4503fa3b67d3bdf6d5b50b511 quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43
Oct 10 06:03:30 np0005479823 podman[216309]: 2025-10-10 10:03:30.07231354 +0000 UTC m=+0.045041380 container create 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0)
Oct 10 06:03:30 np0005479823 podman[216309]: 2025-10-10 10:03:30.050820803 +0000 UTC m=+0.023548663 image pull f541ff382622bd8bc9ad206129d2a8e74c239ff4503fa3b67d3bdf6d5b50b511 quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43
Oct 10 06:03:30 np0005479823 python3[216237]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43
Oct 10 06:03:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:03:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:30 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001a70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:30 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:30 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:31 np0005479823 python3.9[216501]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 06:03:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:31.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:31.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:32 np0005479823 python3.9[216655]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:03:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:32 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:32 np0005479823 python3.9[216756]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 06:03:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:32 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:32 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001a70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:33 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:03:33 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:03:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:33.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:33 np0005479823 python3.9[216909]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760090612.642054-2476-194565114366614/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:03:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:33 np0005479823 python3.9[216985]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 10 06:03:33 np0005479823 systemd[1]: Reloading.
Oct 10 06:03:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:33.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:33 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 06:03:33 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 06:03:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:34 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004150 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:34 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:03:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:34 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004150 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:34 np0005479823 python3.9[217096]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 06:03:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:34 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:34 np0005479823 systemd[1]: Reloading.
Oct 10 06:03:34 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 06:03:34 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 06:03:35 np0005479823 systemd[1]: Starting multipathd container...
Oct 10 06:03:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:03:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:35 np0005479823 systemd[1]: Started libcrun container.
Oct 10 06:03:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:03:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:35.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:03:35 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7467cedc528fc4d24ee0ba087a31037084463ba23bee2d5d3e37d5f355f749d5/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 10 06:03:35 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7467cedc528fc4d24ee0ba087a31037084463ba23bee2d5d3e37d5f355f749d5/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 10 06:03:35 np0005479823 systemd[1]: Started /usr/bin/podman healthcheck run 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c.
Oct 10 06:03:35 np0005479823 podman[217162]: 2025-10-10 10:03:35.270851204 +0000 UTC m=+0.116284925 container init 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, managed_by=edpm_ansible)
Oct 10 06:03:35 np0005479823 multipathd[217177]: + sudo -E kolla_set_configs
Oct 10 06:03:35 np0005479823 podman[217162]: 2025-10-10 10:03:35.300557303 +0000 UTC m=+0.145991004 container start 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 10 06:03:35 np0005479823 podman[217162]: multipathd
Oct 10 06:03:35 np0005479823 systemd[1]: Started multipathd container.
Oct 10 06:03:35 np0005479823 multipathd[217177]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 10 06:03:35 np0005479823 multipathd[217177]: INFO:__main__:Validating config file
Oct 10 06:03:35 np0005479823 multipathd[217177]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 10 06:03:35 np0005479823 multipathd[217177]: INFO:__main__:Writing out command to execute
Oct 10 06:03:35 np0005479823 multipathd[217177]: ++ cat /run_command
Oct 10 06:03:35 np0005479823 multipathd[217177]: + CMD='/usr/sbin/multipathd -d'
Oct 10 06:03:35 np0005479823 multipathd[217177]: + ARGS=
Oct 10 06:03:35 np0005479823 multipathd[217177]: + sudo kolla_copy_cacerts
Oct 10 06:03:35 np0005479823 podman[217184]: 2025-10-10 10:03:35.360619961 +0000 UTC m=+0.051587899 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct 10 06:03:35 np0005479823 multipathd[217177]: + [[ ! -n '' ]]
Oct 10 06:03:35 np0005479823 multipathd[217177]: + . kolla_extend_start
Oct 10 06:03:35 np0005479823 multipathd[217177]: Running command: '/usr/sbin/multipathd -d'
Oct 10 06:03:35 np0005479823 multipathd[217177]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct 10 06:03:35 np0005479823 multipathd[217177]: + umask 0022
Oct 10 06:03:35 np0005479823 multipathd[217177]: + exec /usr/sbin/multipathd -d
Oct 10 06:03:35 np0005479823 systemd[1]: 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c-7000f04392a2e631.service: Main process exited, code=exited, status=1/FAILURE
Oct 10 06:03:35 np0005479823 systemd[1]: 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c-7000f04392a2e631.service: Failed with result 'exit-code'.
Oct 10 06:03:35 np0005479823 multipathd[217177]: 3518.991339 | --------start up--------
Oct 10 06:03:35 np0005479823 multipathd[217177]: 3518.991354 | read /etc/multipath.conf
Oct 10 06:03:35 np0005479823 multipathd[217177]: 3518.996783 | path checkers start up
Oct 10 06:03:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:03:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:35.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:03:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:36 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001a70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:36 np0005479823 python3.9[217366]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 06:03:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:36 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004150 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:36 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:37.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:37 np0005479823 python3.9[217522]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 06:03:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:37 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:03:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:37 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:03:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:37.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:38 np0005479823 python3.9[217687]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 06:03:38 np0005479823 systemd[1]: Stopping multipathd container...
Oct 10 06:03:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:38 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:38 np0005479823 multipathd[217177]: 3522.179430 | exit (signal)
Oct 10 06:03:38 np0005479823 multipathd[217177]: 3522.179485 | --------shut down-------
Oct 10 06:03:38 np0005479823 systemd[1]: libpod-3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c.scope: Deactivated successfully.
Oct 10 06:03:38 np0005479823 podman[217691]: 2025-10-10 10:03:38.599226989 +0000 UTC m=+0.080461141 container died 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 06:03:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:38 np0005479823 systemd[1]: 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c-7000f04392a2e631.timer: Deactivated successfully.
Oct 10 06:03:38 np0005479823 systemd[1]: Stopped /usr/bin/podman healthcheck run 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c.
Oct 10 06:03:38 np0005479823 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c-userdata-shm.mount: Deactivated successfully.
Oct 10 06:03:38 np0005479823 systemd[1]: var-lib-containers-storage-overlay-7467cedc528fc4d24ee0ba087a31037084463ba23bee2d5d3e37d5f355f749d5-merged.mount: Deactivated successfully.
Oct 10 06:03:38 np0005479823 podman[217708]: 2025-10-10 10:03:38.700208394 +0000 UTC m=+0.071615639 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 10 06:03:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:38 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001a70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:38 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004170 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:38 np0005479823 podman[217691]: 2025-10-10 10:03:38.975629 +0000 UTC m=+0.456863142 container cleanup 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct 10 06:03:38 np0005479823 podman[217691]: multipathd
Oct 10 06:03:39 np0005479823 podman[217742]: multipathd
Oct 10 06:03:39 np0005479823 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Oct 10 06:03:39 np0005479823 systemd[1]: Stopped multipathd container.
Oct 10 06:03:39 np0005479823 systemd[1]: Starting multipathd container...
Oct 10 06:03:39 np0005479823 podman[217743]: 2025-10-10 10:03:39.118148342 +0000 UTC m=+0.101367278 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 10 06:03:39 np0005479823 systemd[1]: Started libcrun container.
Oct 10 06:03:39 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7467cedc528fc4d24ee0ba087a31037084463ba23bee2d5d3e37d5f355f749d5/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 10 06:03:39 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7467cedc528fc4d24ee0ba087a31037084463ba23bee2d5d3e37d5f355f749d5/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 10 06:03:39 np0005479823 systemd[1]: Started /usr/bin/podman healthcheck run 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c.
Oct 10 06:03:39 np0005479823 podman[217767]: 2025-10-10 10:03:39.215805541 +0000 UTC m=+0.121609515 container init 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2)
Oct 10 06:03:39 np0005479823 multipathd[217793]: + sudo -E kolla_set_configs
Oct 10 06:03:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:39.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:39 np0005479823 podman[217767]: 2025-10-10 10:03:39.244456716 +0000 UTC m=+0.150260670 container start 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct 10 06:03:39 np0005479823 podman[217767]: multipathd
Oct 10 06:03:39 np0005479823 systemd[1]: Started multipathd container.
Oct 10 06:03:39 np0005479823 multipathd[217793]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 10 06:03:39 np0005479823 multipathd[217793]: INFO:__main__:Validating config file
Oct 10 06:03:39 np0005479823 multipathd[217793]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 10 06:03:39 np0005479823 multipathd[217793]: INFO:__main__:Writing out command to execute
Oct 10 06:03:39 np0005479823 multipathd[217793]: ++ cat /run_command
Oct 10 06:03:39 np0005479823 multipathd[217793]: + CMD='/usr/sbin/multipathd -d'
Oct 10 06:03:39 np0005479823 multipathd[217793]: + ARGS=
Oct 10 06:03:39 np0005479823 multipathd[217793]: + sudo kolla_copy_cacerts
Oct 10 06:03:39 np0005479823 multipathd[217793]: + [[ ! -n '' ]]
Oct 10 06:03:39 np0005479823 multipathd[217793]: + . kolla_extend_start
Oct 10 06:03:39 np0005479823 multipathd[217793]: Running command: '/usr/sbin/multipathd -d'
Oct 10 06:03:39 np0005479823 multipathd[217793]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct 10 06:03:39 np0005479823 multipathd[217793]: + umask 0022
Oct 10 06:03:39 np0005479823 multipathd[217793]: + exec /usr/sbin/multipathd -d
Oct 10 06:03:39 np0005479823 podman[217800]: 2025-10-10 10:03:39.31594812 +0000 UTC m=+0.063461539 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 10 06:03:39 np0005479823 systemd[1]: 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c-52b87c31bcfd2ac1.service: Main process exited, code=exited, status=1/FAILURE
Oct 10 06:03:39 np0005479823 systemd[1]: 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c-52b87c31bcfd2ac1.service: Failed with result 'exit-code'.
Oct 10 06:03:39 np0005479823 multipathd[217793]: 3522.939492 | --------start up--------
Oct 10 06:03:39 np0005479823 multipathd[217793]: 3522.939513 | read /etc/multipath.conf
Oct 10 06:03:39 np0005479823 multipathd[217793]: 3522.944947 | path checkers start up
Oct 10 06:03:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:39.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:40 np0005479823 python3.9[217987]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:03:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:03:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:40 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004170 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:40 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 06:03:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:40 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:40 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:41 np0005479823 python3.9[218141]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 10 06:03:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:41.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:03:41.453 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:03:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:03:41.454 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:03:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:03:41.454 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:03:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:41.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:41 np0005479823 python3.9[218293]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Oct 10 06:03:41 np0005479823 kernel: Key type psk registered
Oct 10 06:03:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:42 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004170 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:42 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:42 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:42 np0005479823 python3.9[218457]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:03:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:43.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:43 np0005479823 python3.9[218581]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1760090622.3625395-2716-97335423636636/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:03:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:03:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:43.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:03:44 np0005479823 python3.9[218733]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:03:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:44 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:44 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004190 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:44 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:03:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:45.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:45 np0005479823 python3.9[218887]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 06:03:45 np0005479823 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct 10 06:03:45 np0005479823 systemd[1]: Stopped Load Kernel Modules.
Oct 10 06:03:45 np0005479823 systemd[1]: Stopping Load Kernel Modules...
Oct 10 06:03:45 np0005479823 systemd[1]: Starting Load Kernel Modules...
Oct 10 06:03:45 np0005479823 systemd[1]: Finished Load Kernel Modules.
Oct 10 06:03:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:03:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:45.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:03:46 np0005479823 python3.9[219043]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 10 06:03:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100346 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 06:03:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:46 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040041b0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:47 np0005479823 podman[219101]: 2025-10-10 10:03:47.232570677 +0000 UTC m=+0.054951895 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct 10 06:03:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:03:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:47.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:03:47 np0005479823 python3.9[219147]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 10 06:03:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:47.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8004380 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:48 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cfc001c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:49.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:03:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:49.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:03:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:03:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:50 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040041d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:50 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf8004380 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:50 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d14001c50 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:51.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:51.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:52 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:52 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d040041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:52 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8cf0001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:03:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:53.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:03:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:53 np0005479823 systemd[1]: Reloading.
Oct 10 06:03:53 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 06:03:53 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 06:03:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:53.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:54 np0005479823 systemd[1]: Reloading.
Oct 10 06:03:54 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 06:03:54 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 06:03:54 np0005479823 systemd-logind[796]: Watching system buttons on /dev/input/event0 (Power Button)
Oct 10 06:03:54 np0005479823 systemd-logind[796]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct 10 06:03:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:54 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d14001c50 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:54 np0005479823 lvm[219268]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct 10 06:03:54 np0005479823 lvm[219268]: VG ceph_vg0 finished
Oct 10 06:03:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:54 np0005479823 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 10 06:03:54 np0005479823 systemd[1]: Starting man-db-cache-update.service...
Oct 10 06:03:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:54 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d1c004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:03:54 np0005479823 systemd[1]: Reloading.
Oct 10 06:03:54 np0005479823 kernel: ganesha.nfsd[219154]: segfault at 50 ip 00007f8dd2c3532e sp 00007f8d9dffa210 error 4 in libntirpc.so.5.8[7f8dd2c1a000+2c000] likely on CPU 4 (core 0, socket 4)
Oct 10 06:03:54 np0005479823 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Oct 10 06:03:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[187747]: 10/10/2025 10:03:54 : epoch 68e8d93a : compute-2 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8d04004210 fd 42 proxy ignored for local
Oct 10 06:03:54 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 06:03:54 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 06:03:55 np0005479823 systemd[1]: Started Process Core Dump (PID 219297/UID 0).
Oct 10 06:03:55 np0005479823 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 10 06:03:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:03:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:55.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:55.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:56 np0005479823 systemd-coredump[219535]: Process 187769 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 67:#012#0  0x00007f8dd2c3532e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Oct 10 06:03:56 np0005479823 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 10 06:03:56 np0005479823 systemd[1]: Finished man-db-cache-update.service.
Oct 10 06:03:56 np0005479823 systemd[1]: man-db-cache-update.service: Consumed 1.522s CPU time.
Oct 10 06:03:56 np0005479823 systemd[1]: run-r047650fce260480c8926b7504c6982d4.service: Deactivated successfully.
Oct 10 06:03:56 np0005479823 systemd[1]: systemd-coredump@6-219297-0.service: Deactivated successfully.
Oct 10 06:03:56 np0005479823 systemd[1]: systemd-coredump@6-219297-0.service: Consumed 1.110s CPU time.
Oct 10 06:03:56 np0005479823 podman[220517]: 2025-10-10 10:03:56.335061921 +0000 UTC m=+0.025454694 container died d2ad8466cb615ea6f40a71e4248a655d65b9d3319f4ec44bced7edb4eae94628 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Oct 10 06:03:56 np0005479823 systemd[1]: var-lib-containers-storage-overlay-3d1140c1c832a7cbf54fd0203a6ee559ca50a4b6de6ddc3b0879e0b1307a09df-merged.mount: Deactivated successfully.
Oct 10 06:03:56 np0005479823 podman[220517]: 2025-10-10 10:03:56.454235907 +0000 UTC m=+0.144628660 container remove d2ad8466cb615ea6f40a71e4248a655d65b9d3319f4ec44bced7edb4eae94628 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Oct 10 06:03:56 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Main process exited, code=exited, status=139/n/a
Oct 10 06:03:56 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Failed with result 'exit-code'.
Oct 10 06:03:56 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.635s CPU time.
Oct 10 06:03:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:57.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:57 np0005479823 python3.9[220690]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.iscsid_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:03:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:57.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:58 np0005479823 python3.9[220840]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 10 06:03:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:03:59.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:03:59 np0005479823 python3.9[220998]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:03:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:03:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:03:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:03:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:03:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:03:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:03:59.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:04:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100400 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 2ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 06:04:00 np0005479823 python3.9[221151]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 10 06:04:00 np0005479823 systemd[1]: Reloading.
Oct 10 06:04:01 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 06:04:01 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 06:04:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:01.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:04:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:01.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:04:02 np0005479823 python3.9[221337]: ansible-ansible.builtin.service_facts Invoked
Oct 10 06:04:02 np0005479823 network[221354]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 10 06:04:02 np0005479823 network[221355]: 'network-scripts' will be removed from distribution in near future.
Oct 10 06:04:02 np0005479823 network[221356]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 10 06:04:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:03.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:03.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:04:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:05.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:05.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:06 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Scheduled restart job, restart counter is at 7.
Oct 10 06:04:06 np0005479823 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 06:04:06 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.635s CPU time.
Oct 10 06:04:06 np0005479823 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 06:04:06 np0005479823 podman[221525]: 2025-10-10 10:04:06.861023148 +0000 UTC m=+0.042847910 container create 4d2a7c9961c6ba6f3729e097fc77944ca60dcabd2592c9e458b8850033b98ec1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.40.1)
Oct 10 06:04:06 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed46e148b9f8eeed63d17e29d76b4f2a4e379a6bb6a82713a88a44ea921fa1d2/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 06:04:06 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed46e148b9f8eeed63d17e29d76b4f2a4e379a6bb6a82713a88a44ea921fa1d2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 06:04:06 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed46e148b9f8eeed63d17e29d76b4f2a4e379a6bb6a82713a88a44ea921fa1d2/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 06:04:06 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed46e148b9f8eeed63d17e29d76b4f2a4e379a6bb6a82713a88a44ea921fa1d2/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 06:04:06 np0005479823 podman[221525]: 2025-10-10 10:04:06.923915316 +0000 UTC m=+0.105740068 container init 4d2a7c9961c6ba6f3729e097fc77944ca60dcabd2592c9e458b8850033b98ec1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Oct 10 06:04:06 np0005479823 podman[221525]: 2025-10-10 10:04:06.839327235 +0000 UTC m=+0.021152017 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 06:04:06 np0005479823 podman[221525]: 2025-10-10 10:04:06.938299456 +0000 UTC m=+0.120124208 container start 4d2a7c9961c6ba6f3729e097fc77944ca60dcabd2592c9e458b8850033b98ec1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 06:04:06 np0005479823 bash[221525]: 4d2a7c9961c6ba6f3729e097fc77944ca60dcabd2592c9e458b8850033b98ec1
Oct 10 06:04:06 np0005479823 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 06:04:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 06:04:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 06:04:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 06:04:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 06:04:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 06:04:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 06:04:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:07 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 06:04:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:07 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:04:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:07.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:07.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:09 np0005479823 podman[221744]: 2025-10-10 10:04:09.086767616 +0000 UTC m=+0.083155998 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct 10 06:04:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:09.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:09 np0005479823 python3.9[221745]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 06:04:09 np0005479823 podman[221768]: 2025-10-10 10:04:09.424079889 +0000 UTC m=+0.091107441 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 10 06:04:09 np0005479823 podman[221769]: 2025-10-10 10:04:09.431631411 +0000 UTC m=+0.094298874 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 10 06:04:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:04:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:09.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:04:10 np0005479823 python3.9[221963]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 06:04:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:04:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:10 np0005479823 python3.9[222116]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 06:04:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:04:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:11.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:04:11 np0005479823 python3.9[222271]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 06:04:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:11.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:12 np0005479823 python3.9[222424]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 06:04:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:12 np0005479823 python3.9[222578]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 06:04:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:13 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:04:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:13 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:04:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:13.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:13 np0005479823 python3.9[222732]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 06:04:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:13.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:04:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:15.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:15 np0005479823 python3.9[222887]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 06:04:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:15.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:16 np0005479823 python3.9[223065]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:04:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:17 np0005479823 python3.9[223219]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:04:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:04:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:17.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:04:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:17 np0005479823 podman[223343]: 2025-10-10 10:04:17.689310661 +0000 UTC m=+0.063879301 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct 10 06:04:17 np0005479823 python3.9[223390]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:04:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:04:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:17.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:04:18 np0005479823 python3.9[223542]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:04:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:19 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 06:04:19 np0005479823 python3.9[223696]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:04:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:04:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:19.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:19 np0005479823 python3.9[223860]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:04:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:19.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:04:20 np0005479823 python3.9[224012]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:04:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:20 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa86c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:20 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:20 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:21 np0005479823 python3.9[224169]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:04:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:04:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:21.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:04:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:21.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:22 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:22 np0005479823 python3.9[224321]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:04:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:22 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa850000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100422 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 06:04:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:22 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:23 np0005479823 python3.9[224475]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:04:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:04:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:23.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:04:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:23 np0005479823 python3.9[224627]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:04:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:23.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:24 np0005479823 python3.9[224779]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:04:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:24 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8480016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:24 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8440016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:24 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa850001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:24 np0005479823 python3.9[224932]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:04:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:04:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:04:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:25.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:04:25 np0005479823 python3.9[225085]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:04:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:25.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:25 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Oct 10 06:04:25 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:25.997193) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 06:04:25 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Oct 10 06:04:25 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090665997265, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 935, "num_deletes": 251, "total_data_size": 2010408, "memory_usage": 2039184, "flush_reason": "Manual Compaction"}
Oct 10 06:04:25 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090666012198, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 1327534, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 19913, "largest_seqno": 20842, "table_properties": {"data_size": 1323321, "index_size": 1929, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9372, "raw_average_key_size": 19, "raw_value_size": 1314853, "raw_average_value_size": 2727, "num_data_blocks": 86, "num_entries": 482, "num_filter_entries": 482, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760090593, "oldest_key_time": 1760090593, "file_creation_time": 1760090665, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 15059 microseconds, and 6369 cpu microseconds.
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.012260) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 1327534 bytes OK
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.012285) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.014497) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.014518) EVENT_LOG_v1 {"time_micros": 1760090666014512, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.014535) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 2005736, prev total WAL file size 2005736, number of live WAL files 2.
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.015163) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(1296KB)], [36(13MB)]
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090666015231, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 15030296, "oldest_snapshot_seqno": -1}
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 4970 keys, 12865378 bytes, temperature: kUnknown
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090666092334, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 12865378, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12831358, "index_size": 20470, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12485, "raw_key_size": 126819, "raw_average_key_size": 25, "raw_value_size": 12740312, "raw_average_value_size": 2563, "num_data_blocks": 839, "num_entries": 4970, "num_filter_entries": 4970, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760090666, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.092544) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 12865378 bytes
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.094789) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 194.8 rd, 166.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 13.1 +0.0 blob) out(12.3 +0.0 blob), read-write-amplify(21.0) write-amplify(9.7) OK, records in: 5486, records dropped: 516 output_compression: NoCompression
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.094805) EVENT_LOG_v1 {"time_micros": 1760090666094797, "job": 20, "event": "compaction_finished", "compaction_time_micros": 77163, "compaction_time_cpu_micros": 22472, "output_level": 6, "num_output_files": 1, "total_output_size": 12865378, "num_input_records": 5486, "num_output_records": 4970, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090666095092, "job": 20, "event": "table_file_deletion", "file_number": 38}
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090666097278, "job": 20, "event": "table_file_deletion", "file_number": 36}
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.015067) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.097325) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.097331) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.097332) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.097334) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:04:26 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:04:26.097335) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:04:26 np0005479823 python3.9[225237]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:04:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:26 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:26 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8480016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:26 np0005479823 python3.9[225390]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:04:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:26 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8440016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:27.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:27.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:28 np0005479823 python3.9[225543]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 06:04:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:28 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa850001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:28 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:28 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8480016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:29.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:29 np0005479823 python3.9[225697]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 10 06:04:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:04:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:29.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:04:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:04:30 np0005479823 python3.9[225849]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 10 06:04:30 np0005479823 systemd[1]: Reloading.
Oct 10 06:04:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:30 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:30 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 06:04:30 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 06:04:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:30 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8440016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:30 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa850002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:31.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:31 np0005479823 python3.9[226038]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 06:04:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:31.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:32 np0005479823 python3.9[226191]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 06:04:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:32 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:32 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:32 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:32 np0005479823 python3.9[226410]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 06:04:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:33.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:33 np0005479823 python3.9[226580]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 06:04:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100433 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 06:04:33 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:04:33 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:04:33 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:04:33 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:04:33 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:04:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:04:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:33.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:04:34 np0005479823 python3.9[226733]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 06:04:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100434 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 06:04:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:34 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa850002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:34 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:34 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:34 np0005479823 python3.9[226887]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 06:04:34 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:04:34 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:04:34 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:04:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:04:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:35.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:35 np0005479823 python3.9[227041]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 06:04:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:35.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:36 np0005479823 python3.9[227219]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 10 06:04:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:36 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:36 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa850002b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:36 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:37.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:04:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:37.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:04:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:38 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:38 np0005479823 python3.9[227375]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:04:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:38 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:38 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa850003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:39 np0005479823 podman[227523]: 2025-10-10 10:04:39.224128259 +0000 UTC m=+0.068142567 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 10 06:04:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:39.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:39 np0005479823 python3.9[227572]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:04:39 np0005479823 podman[227574]: 2025-10-10 10:04:39.543798069 +0000 UTC m=+0.061738042 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd)
Oct 10 06:04:39 np0005479823 podman[227575]: 2025-10-10 10:04:39.565969337 +0000 UTC m=+0.083498298 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct 10 06:04:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:39 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:04:39 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:04:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:39.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:40 np0005479823 python3.9[227772]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:04:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:04:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:40 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:40 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:40 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:41 np0005479823 python3.9[227926]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:04:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:04:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:41.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:04:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:04:41.455 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:04:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:04:41.457 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:04:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:04:41.457 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:04:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:41 np0005479823 python3.9[228078]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:04:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:41.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:42 np0005479823 python3.9[228230]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:04:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:42 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:04:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:42 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa850003c60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:42 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:42 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0016e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:43 np0005479823 python3.9[228384]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:04:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:43.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:43 np0005479823 python3.9[228536]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:04:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:43.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:44 np0005479823 python3.9[228688]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:04:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:44 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:44 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa850003c60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:44 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:44 np0005479823 python3.9[228841]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:04:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:04:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:45.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:45 np0005479823 python3.9[228994]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:04:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:45 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:04:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:45 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:04:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:45 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:04:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:45 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:04:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:45.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:46 np0005479823 python3.9[229146]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:04:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:46 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0016e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:46 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:46 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa850003c60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:47.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:47.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:48 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa850003c60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:48 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 06:04:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:48 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0037a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:48 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:48 np0005479823 podman[229174]: 2025-10-10 10:04:48.816144057 +0000 UTC m=+0.084476759 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct 10 06:04:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:04:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:49.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:04:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:49.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:04:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:50 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:50 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa850003c60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:50 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0037a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:51.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:51.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:52 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:52 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:52 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:53 np0005479823 python3.9[229327]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Oct 10 06:04:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:04:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:53.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:04:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100453 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 06:04:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:53.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:54 np0005479823 python3.9[229480]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 10 06:04:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100454 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 06:04:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:54 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0037a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:54 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868001ac0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:54 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:04:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:04:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:55.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:04:55 np0005479823 python3.9[229640]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 10 06:04:55 np0005479823 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 06:04:55 np0005479823 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 06:04:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:55.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:56 np0005479823 systemd-logind[796]: New session 56 of user zuul.
Oct 10 06:04:56 np0005479823 systemd[1]: Started Session 56 of User zuul.
Oct 10 06:04:56 np0005479823 ceph-osd[77423]: bluestore.MempoolThread fragmentation_score=0.000024 took=0.000092s
Oct 10 06:04:56 np0005479823 systemd[1]: session-56.scope: Deactivated successfully.
Oct 10 06:04:56 np0005479823 systemd-logind[796]: Session 56 logged out. Waiting for processes to exit.
Oct 10 06:04:56 np0005479823 systemd-logind[796]: Removed session 56.
Oct 10 06:04:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:56 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8380016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:56 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0037a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:56 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8680023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:57 np0005479823 python3.9[229854]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:04:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:57.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:57 np0005479823 python3.9[229975]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090696.8281558-4354-184256217593510/.source.json follow=False _original_basename=config.json.j2 checksum=2c2474b5f24ef7c9ed37f49680082593e0d1100b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:04:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:57.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:58 np0005479823 python3.9[230125]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:04:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:58 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:58 np0005479823 python3.9[230201]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:04:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:58 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:04:58 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0037a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:04:59 np0005479823 python3.9[230353]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:04:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:04:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:04:59.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:04:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:04:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:04:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:04:59 np0005479823 python3.9[230474]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090698.8020444-4354-57649100125623/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:04:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:04:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:04:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:04:59.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:05:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:05:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:00 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8680023e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:00 np0005479823 python3.9[230624]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:05:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:00 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:00 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:01 np0005479823 python3.9[230747]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090700.17484-4354-213474519980429/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=d01cc1b48d783e4ed08d12bb4d0a107aba230a69 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:05:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:01.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:01 np0005479823 python3.9[230897]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:05:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:01.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:02 np0005479823 python3.9[231018]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090701.4614491-4354-39984000135181/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:05:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:02 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0037c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:02 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8680030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:02 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:05:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:03.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:05:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:03.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:04 np0005479823 python3.9[231172]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:05:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:04 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:04 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0037e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:04 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8680030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:05:05 np0005479823 python3.9[231326]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:05:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:05.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:05.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:06 np0005479823 python3.9[231478]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 06:05:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003800 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:06 np0005479823 python3.9[231631]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:05:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:07.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:07 np0005479823 python3.9[231755]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1760090706.400211-4633-271045842952785/.source _original_basename=.ktnnso3a follow=False checksum=19921791aaa0ec1498e105f7ca2c9a3a0c3d4795 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Oct 10 06:05:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:07.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:08 np0005479823 python3.9[231907]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 06:05:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:08 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:08 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa844003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:08 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8380032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:09.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:09 np0005479823 python3.9[232061]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:05:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:09 np0005479823 podman[232158]: 2025-10-10 10:05:09.789948147 +0000 UTC m=+0.054680408 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 10 06:05:09 np0005479823 podman[232156]: 2025-10-10 10:05:09.791151755 +0000 UTC m=+0.065034658 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 10 06:05:09 np0005479823 podman[232157]: 2025-10-10 10:05:09.817692313 +0000 UTC m=+0.085793202 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 10 06:05:09 np0005479823 python3.9[232231]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090708.9445035-4712-37263100983407/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=837ffd9c004e5987a2e117698c56827ebbfeb5b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:05:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:10.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:05:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:10 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:10 np0005479823 python3.9[232399]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 10 06:05:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:10 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:10 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:11 np0005479823 python3.9[232521]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1760090710.332985-4756-88156178199187/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=722ab36345f3375cbdcf911ce8f6e1a8083d7e59 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 10 06:05:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:11.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:12.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:12 np0005479823 python3.9[232673]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Oct 10 06:05:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:12 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa8380032f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:12 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:12 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:13.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:13 np0005479823 python3.9[232827]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 10 06:05:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:14.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:14 np0005479823 python3[232979]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Oct 10 06:05:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:14 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:14 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:14 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:05:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:15.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:16.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:16 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:16 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:16 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:17.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:18.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:18 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:18 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:18 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:05:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:19.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:05:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:19 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 06:05:19 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.0 total, 600.0 interval#012Cumulative writes: 3941 writes, 21K keys, 3941 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.05 MB/s#012Cumulative WAL: 3941 writes, 3941 syncs, 1.00 writes per sync, written: 0.05 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1443 writes, 6884 keys, 1443 commit groups, 1.0 writes per commit group, ingest: 16.49 MB, 0.03 MB/s#012Interval WAL: 1443 writes, 1443 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    132.0      0.24              0.08        10    0.024       0      0       0.0       0.0#012  L6      1/0   12.27 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.5    168.2    142.7      0.78              0.28         9    0.087     43K   4823       0.0       0.0#012 Sum      1/0   12.27 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.5    128.3    140.2      1.03              0.37        19    0.054     43K   4823       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   5.4    120.9    121.2      0.52              0.13         8    0.065     22K   2562       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    168.2    142.7      0.78              0.28         9    0.087     43K   4823       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    133.0      0.24              0.08         9    0.027       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.031, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.14 GB write, 0.12 MB/s write, 0.13 GB read, 0.11 MB/s read, 1.0 seconds#012Interval compaction: 0.06 GB write, 0.11 MB/s write, 0.06 GB read, 0.11 MB/s read, 0.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x56161a963350#2 capacity: 304.00 MB usage: 8.78 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 9.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(481,8.41 MB,2.76774%) FilterBlock(19,130.05 KB,0.041776%) IndexBlock(19,240.70 KB,0.0773229%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Oct 10 06:05:19 np0005479823 podman[233063]: 2025-10-10 10:05:19.910721718 +0000 UTC m=+0.186692046 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct 10 06:05:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:20.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:05:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:20 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:20 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:20 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c0016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:21.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:05:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:22.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:05:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:22 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:22 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:22 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:23.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:24.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:24 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c0016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:24 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:24 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:25 np0005479823 podman[232992]: 2025-10-10 10:05:25.062605743 +0000 UTC m=+10.593652548 image pull 7ac362f4e836cf46e10a309acb4abf774df9481a1d6404c213437495cfb42f5d quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844
Oct 10 06:05:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:05:25 np0005479823 podman[233138]: 2025-10-10 10:05:25.173499911 +0000 UTC m=+0.021411925 image pull 7ac362f4e836cf46e10a309acb4abf774df9481a1d6404c213437495cfb42f5d quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844
Oct 10 06:05:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:25.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:26.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:26 np0005479823 podman[233138]: 2025-10-10 10:05:26.135621926 +0000 UTC m=+0.983533920 container create 95a19b4fa8f1397f8b1735f74117bd5512d5516b425caa220dcdff41907f909e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute_init, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_managed=true)
Oct 10 06:05:26 np0005479823 python3[232979]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844 bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Oct 10 06:05:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:26 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003840 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:26 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c0016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:26 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:27 np0005479823 python3.9[233330]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 06:05:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:27.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:28.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:28 np0005479823 python3.9[233484]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Oct 10 06:05:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:28 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:28 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c0039e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:28 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:29.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:29 np0005479823 python3.9[233638]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 10 06:05:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:30.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:05:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:30 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:30 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:30 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:30 np0005479823 python3[233791]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Oct 10 06:05:31 np0005479823 podman[233831]: 2025-10-10 10:05:31.065885394 +0000 UTC m=+0.055971877 container create a677c95a28d87d0fa9989f1bbc85251f20b3e9e51830cb4a7abc069cf607269f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 06:05:31 np0005479823 podman[233831]: 2025-10-10 10:05:31.034522453 +0000 UTC m=+0.024609026 image pull 7ac362f4e836cf46e10a309acb4abf774df9481a1d6404c213437495cfb42f5d quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844
Oct 10 06:05:31 np0005479823 python3[233791]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844 kolla_start
Oct 10 06:05:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:05:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:31.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:05:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:32.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:32 np0005479823 python3.9[234020]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 06:05:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:32 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:32 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:32 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:32 np0005479823 python3.9[234175]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:05:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:33.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:33 np0005479823 python3.9[234327]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1760090733.0295084-5031-230863453702335/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 10 06:05:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:34.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:34 np0005479823 python3.9[234403]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 10 06:05:34 np0005479823 systemd[1]: Reloading.
Oct 10 06:05:34 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 06:05:34 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 06:05:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:34 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:34 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003d60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:34 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:35 np0005479823 python3.9[234515]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 10 06:05:35 np0005479823 systemd[1]: Reloading.
Oct 10 06:05:35 np0005479823 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 10 06:05:35 np0005479823 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 10 06:05:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:05:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Oct 10 06:05:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:35.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:35 np0005479823 systemd[1]: Starting nova_compute container...
Oct 10 06:05:35 np0005479823 systemd[1]: Started libcrun container.
Oct 10 06:05:35 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c42589750c71ce8f99ff19c16cbc797b53200611ad4b531f0f37c2c8d9ef3689/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 10 06:05:35 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c42589750c71ce8f99ff19c16cbc797b53200611ad4b531f0f37c2c8d9ef3689/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct 10 06:05:35 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c42589750c71ce8f99ff19c16cbc797b53200611ad4b531f0f37c2c8d9ef3689/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 10 06:05:35 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c42589750c71ce8f99ff19c16cbc797b53200611ad4b531f0f37c2c8d9ef3689/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 10 06:05:35 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c42589750c71ce8f99ff19c16cbc797b53200611ad4b531f0f37c2c8d9ef3689/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 10 06:05:35 np0005479823 podman[234556]: 2025-10-10 10:05:35.59512398 +0000 UTC m=+0.107480509 container init a677c95a28d87d0fa9989f1bbc85251f20b3e9e51830cb4a7abc069cf607269f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 06:05:35 np0005479823 podman[234556]: 2025-10-10 10:05:35.603013551 +0000 UTC m=+0.115370050 container start a677c95a28d87d0fa9989f1bbc85251f20b3e9e51830cb4a7abc069cf607269f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 06:05:35 np0005479823 podman[234556]: nova_compute
Oct 10 06:05:35 np0005479823 nova_compute[234571]: + sudo -E kolla_set_configs
Oct 10 06:05:35 np0005479823 systemd[1]: Started nova_compute container.
Oct 10 06:05:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Validating config file
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Copying service configuration files
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Deleting /etc/ceph
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Creating directory /etc/ceph
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Setting permission for /etc/ceph
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Writing out command to execute
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 10 06:05:35 np0005479823 nova_compute[234571]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 10 06:05:35 np0005479823 nova_compute[234571]: ++ cat /run_command
Oct 10 06:05:35 np0005479823 nova_compute[234571]: + CMD=nova-compute
Oct 10 06:05:35 np0005479823 nova_compute[234571]: + ARGS=
Oct 10 06:05:35 np0005479823 nova_compute[234571]: + sudo kolla_copy_cacerts
Oct 10 06:05:35 np0005479823 nova_compute[234571]: + [[ ! -n '' ]]
Oct 10 06:05:35 np0005479823 nova_compute[234571]: + . kolla_extend_start
Oct 10 06:05:35 np0005479823 nova_compute[234571]: + echo 'Running command: '\''nova-compute'\'''
Oct 10 06:05:35 np0005479823 nova_compute[234571]: Running command: 'nova-compute'
Oct 10 06:05:35 np0005479823 nova_compute[234571]: + umask 0022
Oct 10 06:05:35 np0005479823 nova_compute[234571]: + exec nova-compute
Oct 10 06:05:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:05:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:36.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:05:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:36 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868003e00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:36 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:36 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003d80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:37.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:37 np0005479823 python3.9[234760]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 06:05:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:37 np0005479823 nova_compute[234571]: 2025-10-10 10:05:37.907 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct 10 06:05:37 np0005479823 nova_compute[234571]: 2025-10-10 10:05:37.908 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct 10 06:05:37 np0005479823 nova_compute[234571]: 2025-10-10 10:05:37.908 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct 10 06:05:37 np0005479823 nova_compute[234571]: 2025-10-10 10:05:37.908 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Oct 10 06:05:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:38.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.061 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.077 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:05:38 np0005479823 python3.9[234913]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 06:05:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:38 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.790 2 INFO nova.virt.driver [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Oct 10 06:05:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:38 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868004b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:38 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.932 2 INFO nova.compute.provider_config [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.982 2 DEBUG oslo_concurrency.lockutils [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.982 2 DEBUG oslo_concurrency.lockutils [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.983 2 DEBUG oslo_concurrency.lockutils [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.983 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.983 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.983 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.984 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.984 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.984 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.984 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.984 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.984 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.985 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.985 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.985 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.985 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.985 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.985 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.986 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.986 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.986 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.986 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.986 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.987 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.987 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.987 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.987 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.987 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.988 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.988 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.988 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.988 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.988 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.989 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.989 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.989 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.989 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.989 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.990 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.990 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.990 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.990 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.990 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.990 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.991 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.991 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.991 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.991 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.992 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.992 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.992 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.992 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.992 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.992 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.993 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.993 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.993 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.993 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.993 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.994 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.994 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.994 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.994 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.994 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.994 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.995 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.995 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.995 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.995 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.995 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.996 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.996 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.996 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.996 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.996 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.997 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.997 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.997 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.997 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.997 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.998 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.998 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.998 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.998 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.999 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.999 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.999 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:38 np0005479823 nova_compute[234571]: 2025-10-10 10:05:38.999 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.000 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.000 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.000 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.000 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.001 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.001 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.001 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.001 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.002 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.002 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.002 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.002 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.003 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.003 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.003 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.003 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.004 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.004 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.004 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.004 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.004 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.005 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.005 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.005 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.005 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.005 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.006 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.006 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.006 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.006 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.006 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.007 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.007 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.007 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.007 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.007 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.008 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.008 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.008 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.008 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.008 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.008 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.009 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.009 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.009 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.009 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.010 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.010 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.010 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.010 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.010 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.011 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.011 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.011 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.011 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.011 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.012 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.012 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.012 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.012 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.012 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.013 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.013 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.013 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.013 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.013 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.014 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.014 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.014 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.014 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.014 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.015 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.015 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.015 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.015 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.016 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.016 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.016 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.016 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.016 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.017 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.017 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.017 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.017 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.017 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.018 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.018 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.018 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.018 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.019 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.019 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.019 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.019 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.020 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.020 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.020 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.020 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.021 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.021 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.021 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.021 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.021 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.021 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.022 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.022 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.022 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.022 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.022 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.023 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.023 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.023 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.023 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.023 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.024 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.024 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.024 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.024 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.024 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.025 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.025 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.025 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.025 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.026 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.026 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.026 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.026 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.026 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.027 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.027 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.027 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.027 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.027 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.028 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.028 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.028 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.028 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.028 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.028 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.029 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.029 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.029 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.029 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.029 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.030 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.030 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.030 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.030 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.030 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.030 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.031 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.031 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.031 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.031 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.031 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.032 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.032 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.032 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.032 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.033 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.033 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.033 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.033 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.033 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.033 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.034 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.034 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.034 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.034 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.034 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.035 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.035 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.035 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.035 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.035 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.035 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.036 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.036 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.036 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.036 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.036 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.037 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.037 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.037 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.037 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.037 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.038 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.038 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.038 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.038 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.038 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.038 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.039 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.039 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.039 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.039 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.039 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.040 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.040 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.040 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.040 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.041 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.041 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.041 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.041 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.042 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.042 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.042 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.042 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.042 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.043 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.043 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.043 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.043 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.044 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.044 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.044 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.044 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.044 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.045 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.045 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.045 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.045 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.045 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.045 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.046 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.046 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.046 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.046 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.046 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.047 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.047 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.047 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.047 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.047 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.048 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.048 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.048 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.048 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.048 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.048 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.049 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.049 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.049 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.049 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.049 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.050 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.050 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.050 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.050 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.050 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.050 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.051 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.051 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.051 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.051 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.052 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.052 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.052 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.052 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.052 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.053 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.053 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.053 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.053 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.054 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.054 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.054 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.054 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.054 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.055 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.055 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.055 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.055 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.055 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.056 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.056 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.056 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.056 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.056 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.057 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.057 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.057 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.057 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.057 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.058 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.058 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.058 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.058 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.058 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.059 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.059 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.059 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.059 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.059 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.060 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.060 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.060 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.060 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.061 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.061 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.061 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.061 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.061 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.062 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.062 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.062 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.062 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.062 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.063 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.063 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.063 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.063 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.063 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.064 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.064 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.064 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.064 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.064 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.065 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.065 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.065 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.065 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.065 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.066 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.066 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.066 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.066 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.067 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.067 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.067 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.067 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.068 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.068 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.068 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.068 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.068 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.068 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.069 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.069 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.069 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.069 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.070 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.070 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.070 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.070 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.071 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.071 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.071 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.071 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.071 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.072 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.072 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.072 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.072 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.073 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.073 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.073 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.073 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.073 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.073 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.074 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.074 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.074 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.074 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.074 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.075 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.075 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.075 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.075 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.076 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.076 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.076 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.076 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.077 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.077 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.077 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.077 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.077 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.078 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.078 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.078 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.078 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.078 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.079 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.079 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.079 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.079 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.079 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.080 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.080 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.080 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.080 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.080 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.081 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.081 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.081 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.081 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.081 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.082 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.082 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.082 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.082 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.082 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.083 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.083 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.083 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.083 2 WARNING oslo_config.cfg [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct 10 06:05:39 np0005479823 nova_compute[234571]: live_migration_uri is deprecated for removal in favor of two other options that
Oct 10 06:05:39 np0005479823 nova_compute[234571]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct 10 06:05:39 np0005479823 nova_compute[234571]: and ``live_migration_inbound_addr`` respectively.
Oct 10 06:05:39 np0005479823 nova_compute[234571]: ).  Its value may be silently ignored in the future.#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.084 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.084 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.084 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.084 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.085 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.085 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.085 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.085 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.086 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.086 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.086 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.086 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.087 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.087 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.087 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.087 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.088 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.088 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.088 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.rbd_secret_uuid        = 21f084a3-af34-5230-afe4-ea5cd24a55f4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.088 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.088 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.089 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.089 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.089 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.089 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.089 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.090 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.090 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.090 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.090 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.090 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.091 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.091 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.091 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.091 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.091 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.092 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.092 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.092 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.092 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.092 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.093 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.093 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.093 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.093 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.094 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.094 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.094 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.094 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.094 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.094 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.095 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.095 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.095 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.095 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.095 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.096 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.096 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.096 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.096 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.096 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.096 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.097 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.097 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.097 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.097 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.097 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.098 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.098 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.098 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.098 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.099 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.099 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.099 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.099 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.099 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.100 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.100 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.100 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.100 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.100 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.101 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.101 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.101 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.101 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.101 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.101 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.102 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.102 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.102 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.102 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.102 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.103 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.103 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.103 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.103 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.104 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.104 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.104 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.104 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.104 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.105 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.105 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.105 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.105 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.106 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.106 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.106 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.106 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.106 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.107 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.107 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.107 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.107 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.108 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.108 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.108 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.108 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.108 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.108 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.109 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.109 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.109 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.109 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.109 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.110 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.110 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.110 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.110 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.110 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.111 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.111 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.111 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.111 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.111 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.112 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.112 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.112 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.112 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.113 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.113 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.113 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.113 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.113 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.114 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.114 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.114 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.114 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.114 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.115 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.115 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.115 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.115 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.115 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.115 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.116 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.116 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.116 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.116 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.117 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.117 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.117 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.117 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.117 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.117 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.118 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.118 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.118 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.118 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.119 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.119 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.119 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.119 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.120 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.120 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.120 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.120 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.120 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.121 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.121 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.121 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.121 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.122 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.122 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.122 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.122 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.123 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.123 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.123 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.123 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.123 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.123 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.124 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.124 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.124 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.124 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.124 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.125 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.125 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.125 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.125 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.126 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.126 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.126 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.126 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.126 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.127 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.127 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.127 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.127 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.127 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.128 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.128 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.128 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.128 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.128 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.129 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.129 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.129 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.129 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.129 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.129 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.130 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.130 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.130 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.130 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.130 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.131 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.131 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.131 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.131 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.131 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.131 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.132 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.132 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.132 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.132 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.132 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.133 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.133 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.133 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.133 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.133 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.134 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.134 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.134 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.134 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.134 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.135 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.135 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.135 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.136 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.136 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.136 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.136 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.136 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.137 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.137 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.137 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.137 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.137 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.137 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.138 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.138 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.138 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.138 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.138 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.139 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.139 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.139 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.139 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.139 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.140 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.140 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.140 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.140 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.140 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.140 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.141 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.141 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.141 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.141 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.141 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.142 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.142 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.142 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.142 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.142 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.142 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.143 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.143 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.143 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.143 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.144 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.144 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.144 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.145 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.145 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.145 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.145 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.145 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.146 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.146 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.146 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.146 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.146 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.146 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.147 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.147 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.147 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.147 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.147 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.148 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.148 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.148 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.148 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.148 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.149 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.149 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.149 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.149 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.150 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.150 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.150 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.150 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.150 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.151 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.151 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.151 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.151 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.151 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.152 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.152 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.152 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.152 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.152 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.153 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.153 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.153 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.153 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.153 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.154 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.154 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.154 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.154 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.154 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.154 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.155 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.155 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.155 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.155 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.156 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.156 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.156 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.156 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.156 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.157 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.157 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.157 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.157 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.157 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.157 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.158 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.158 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.158 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.158 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.158 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.159 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.159 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.159 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.159 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.159 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.160 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.160 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.160 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.160 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.161 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.161 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.161 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.161 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.161 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.162 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.162 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.162 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.162 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.162 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.162 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.163 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.163 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.163 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.163 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.163 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.164 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.164 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.164 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.164 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.165 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.165 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.165 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.165 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.165 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.165 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.166 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.166 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.166 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.166 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.166 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.167 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.167 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.167 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.167 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.167 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.168 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.168 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.168 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.168 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.169 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.169 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.169 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.169 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.169 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.169 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.170 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.170 2 DEBUG oslo_service.service [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.171 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.196 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.197 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.197 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.197 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Oct 10 06:05:39 np0005479823 systemd[1]: Starting libvirt QEMU daemon...
Oct 10 06:05:39 np0005479823 systemd[1]: Started libvirt QEMU daemon.
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.287 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f46c545a5e0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.290 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f46c545a5e0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.291 2 INFO nova.virt.libvirt.driver [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Connection event '1' reason 'None'#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.315 2 WARNING nova.virt.libvirt.driver [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Oct 10 06:05:39 np0005479823 nova_compute[234571]: 2025-10-10 10:05:39.316 2 DEBUG nova.virt.libvirt.volume.mount [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Oct 10 06:05:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:39.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:39 np0005479823 python3.9[235066]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 10 06:05:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:39 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:05:39 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:05:39 np0005479823 podman[235266]: 2025-10-10 10:05:39.894072511 +0000 UTC m=+0.055685907 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible)
Oct 10 06:05:39 np0005479823 podman[235267]: 2025-10-10 10:05:39.922748765 +0000 UTC m=+0.082105759 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Oct 10 06:05:39 np0005479823 podman[235265]: 2025-10-10 10:05:39.935678678 +0000 UTC m=+0.095225698 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct 10 06:05:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:40.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:40 np0005479823 nova_compute[234571]: 2025-10-10 10:05:40.215 2 INFO nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Libvirt host capabilities <capabilities>
Oct 10 06:05:40 np0005479823 nova_compute[234571]: 
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <host>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <uuid>55d065af-0252-4401-ad6e-822a36bead06</uuid>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <cpu>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <arch>x86_64</arch>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model>EPYC-Rome-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <vendor>AMD</vendor>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <microcode version='16777317'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <signature family='23' model='49' stepping='0'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <maxphysaddr mode='emulate' bits='40'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature name='x2apic'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature name='tsc-deadline'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature name='osxsave'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature name='hypervisor'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature name='tsc_adjust'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature name='spec-ctrl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature name='stibp'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature name='arch-capabilities'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature name='ssbd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature name='cmp_legacy'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature name='topoext'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature name='virt-ssbd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature name='lbrv'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature name='tsc-scale'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature name='vmcb-clean'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature name='pause-filter'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature name='pfthreshold'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature name='svme-addr-chk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature name='rdctl-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature name='skip-l1dfl-vmentry'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature name='mds-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature name='pschange-mc-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <pages unit='KiB' size='4'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <pages unit='KiB' size='2048'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <pages unit='KiB' size='1048576'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </cpu>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <power_management>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <suspend_mem/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </power_management>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <iommu support='no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <migration_features>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <live/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <uri_transports>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <uri_transport>tcp</uri_transport>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <uri_transport>rdma</uri_transport>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </uri_transports>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </migration_features>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <topology>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <cells num='1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <cell id='0'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:          <memory unit='KiB'>7864356</memory>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:          <pages unit='KiB' size='4'>1966089</pages>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:          <pages unit='KiB' size='2048'>0</pages>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:          <pages unit='KiB' size='1048576'>0</pages>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:          <distances>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:            <sibling id='0' value='10'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:          </distances>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:          <cpus num='8'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:          </cpus>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        </cell>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </cells>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </topology>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <cache>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </cache>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <secmodel>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model>selinux</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <doi>0</doi>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </secmodel>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <secmodel>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model>dac</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <doi>0</doi>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <baselabel type='kvm'>+107:+107</baselabel>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <baselabel type='qemu'>+107:+107</baselabel>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </secmodel>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  </host>
Oct 10 06:05:40 np0005479823 nova_compute[234571]: 
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <guest>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <os_type>hvm</os_type>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <arch name='i686'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <wordsize>32</wordsize>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <domain type='qemu'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <domain type='kvm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </arch>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <features>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <pae/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <nonpae/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <acpi default='on' toggle='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <apic default='on' toggle='no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <cpuselection/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <deviceboot/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <disksnapshot default='on' toggle='no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <externalSnapshot/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </features>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  </guest>
Oct 10 06:05:40 np0005479823 nova_compute[234571]: 
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <guest>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <os_type>hvm</os_type>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <arch name='x86_64'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <wordsize>64</wordsize>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <domain type='qemu'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <domain type='kvm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </arch>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <features>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <acpi default='on' toggle='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <apic default='on' toggle='no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <cpuselection/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <deviceboot/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <disksnapshot default='on' toggle='no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <externalSnapshot/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </features>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  </guest>
Oct 10 06:05:40 np0005479823 nova_compute[234571]: 
Oct 10 06:05:40 np0005479823 nova_compute[234571]: </capabilities>
Oct 10 06:05:40 np0005479823 nova_compute[234571]: 2025-10-10 10:05:40.229 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct 10 06:05:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:05:40 np0005479823 nova_compute[234571]: 2025-10-10 10:05:40.259 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct 10 06:05:40 np0005479823 nova_compute[234571]: <domainCapabilities>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <path>/usr/libexec/qemu-kvm</path>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <domain>kvm</domain>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <machine>pc-q35-rhel9.6.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <arch>i686</arch>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <vcpu max='4096'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <iothreads supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <os supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <enum name='firmware'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <loader supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='type'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>rom</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>pflash</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='readonly'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>yes</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>no</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='secure'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>no</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </loader>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  </os>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <cpu>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <mode name='host-passthrough' supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='hostPassthroughMigratable'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>on</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>off</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </mode>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <mode name='maximum' supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='maximumMigratable'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>on</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>off</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </mode>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <mode name='host-model' supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model fallback='forbid'>EPYC-Rome</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <vendor>AMD</vendor>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='x2apic'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='tsc-deadline'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='hypervisor'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='tsc_adjust'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='spec-ctrl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='stibp'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='arch-capabilities'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='ssbd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='cmp_legacy'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='overflow-recov'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='succor'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='ibrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='amd-ssbd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='virt-ssbd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='lbrv'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='tsc-scale'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='vmcb-clean'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='flushbyasid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='pause-filter'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='pfthreshold'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='svme-addr-chk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='lfence-always-serializing'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='rdctl-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='mds-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='pschange-mc-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='gds-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='rfds-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='disable' name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </mode>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <mode name='custom' supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-noTSX'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server-noTSX'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server-v5'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cooperlake'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cooperlake-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cooperlake-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Denverton'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mpx'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Denverton-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mpx'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Denverton-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Denverton-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Dhyana-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Genoa'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amd-psfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='auto-ibrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='no-nested-data-bp'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='null-sel-clr-base'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='stibp-always-on'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Genoa-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amd-psfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='auto-ibrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='no-nested-data-bp'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='null-sel-clr-base'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='stibp-always-on'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Milan'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Milan-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Milan-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amd-psfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='no-nested-data-bp'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='null-sel-clr-base'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='stibp-always-on'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Rome'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Rome-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Rome-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Rome-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='GraniteRapids'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mcdt-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pbrsb-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='prefetchiti'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='GraniteRapids-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mcdt-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pbrsb-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='prefetchiti'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='GraniteRapids-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx10'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx10-128'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx10-256'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx10-512'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mcdt-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pbrsb-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='prefetchiti'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-noTSX'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-noTSX-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-noTSX'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v5'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v6'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v7'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='IvyBridge'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='IvyBridge-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='IvyBridge-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='IvyBridge-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='KnightsMill'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-4fmaps'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-4vnniw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512er'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512pf'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='KnightsMill-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-4fmaps'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-4vnniw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512er'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512pf'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Opteron_G4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fma4'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xop'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Opteron_G4-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fma4'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xop'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Opteron_G5'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fma4'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tbm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xop'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Opteron_G5-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fma4'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tbm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xop'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='SapphireRapids'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='SapphireRapids-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='SapphireRapids-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='SapphireRapids-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='SierraForest'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-ne-convert'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cmpccxadd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mcdt-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pbrsb-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='SierraForest-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-ne-convert'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cmpccxadd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mcdt-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pbrsb-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-v5'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Snowridge'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='core-capability'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mpx'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='split-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Snowridge-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='core-capability'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mpx'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='split-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Snowridge-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='core-capability'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='split-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Snowridge-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='core-capability'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='split-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Snowridge-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='athlon'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnow'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnowext'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='athlon-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnow'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnowext'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='core2duo'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='core2duo-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='coreduo'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='coreduo-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='n270'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='n270-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='phenom'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnow'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnowext'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='phenom-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnow'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnowext'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </mode>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  </cpu>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <memoryBacking supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <enum name='sourceType'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <value>file</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <value>anonymous</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <value>memfd</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  </memoryBacking>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <devices>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <disk supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='diskDevice'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>disk</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>cdrom</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>floppy</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>lun</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='bus'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>fdc</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>scsi</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>usb</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>sata</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='model'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio-transitional</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio-non-transitional</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </disk>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <graphics supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='type'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>vnc</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>egl-headless</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>dbus</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </graphics>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <video supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='modelType'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>vga</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>cirrus</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>none</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>bochs</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>ramfb</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </video>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <hostdev supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='mode'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>subsystem</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='startupPolicy'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>default</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>mandatory</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>requisite</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>optional</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='subsysType'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>usb</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>pci</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>scsi</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='capsType'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='pciBackend'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </hostdev>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <rng supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='model'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio-transitional</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio-non-transitional</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='backendModel'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>random</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>egd</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>builtin</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </rng>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <filesystem supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='driverType'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>path</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>handle</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtiofs</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </filesystem>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <tpm supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='model'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>tpm-tis</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>tpm-crb</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='backendModel'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>emulator</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>external</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='backendVersion'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>2.0</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </tpm>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <redirdev supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='bus'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>usb</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </redirdev>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <channel supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='type'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>pty</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>unix</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </channel>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <crypto supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='model'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='type'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>qemu</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='backendModel'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>builtin</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </crypto>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <interface supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='backendType'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>default</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>passt</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </interface>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <panic supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='model'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>isa</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>hyperv</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </panic>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  </devices>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <features>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <gic supported='no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <vmcoreinfo supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <genid supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <backingStoreInput supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <backup supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <async-teardown supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <ps2 supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <sev supported='no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <sgx supported='no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <hyperv supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='features'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>relaxed</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>vapic</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>spinlocks</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>vpindex</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>runtime</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>synic</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>stimer</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>reset</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>vendor_id</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>frequencies</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>reenlightenment</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>tlbflush</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>ipi</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>avic</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>emsr_bitmap</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>xmm_input</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </hyperv>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <launchSecurity supported='no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  </features>
Oct 10 06:05:40 np0005479823 nova_compute[234571]: </domainCapabilities>
Oct 10 06:05:40 np0005479823 nova_compute[234571]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct 10 06:05:40 np0005479823 nova_compute[234571]: 2025-10-10 10:05:40.268 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct 10 06:05:40 np0005479823 nova_compute[234571]: <domainCapabilities>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <path>/usr/libexec/qemu-kvm</path>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <domain>kvm</domain>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <arch>i686</arch>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <vcpu max='240'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <iothreads supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <os supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <enum name='firmware'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <loader supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='type'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>rom</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>pflash</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='readonly'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>yes</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>no</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='secure'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>no</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </loader>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  </os>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <cpu>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <mode name='host-passthrough' supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='hostPassthroughMigratable'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>on</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>off</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </mode>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <mode name='maximum' supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='maximumMigratable'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>on</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>off</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </mode>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <mode name='host-model' supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model fallback='forbid'>EPYC-Rome</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <vendor>AMD</vendor>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='x2apic'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='tsc-deadline'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='hypervisor'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='tsc_adjust'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='spec-ctrl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='stibp'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='arch-capabilities'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='ssbd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='cmp_legacy'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='overflow-recov'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='succor'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='ibrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='amd-ssbd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='virt-ssbd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='lbrv'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='tsc-scale'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='vmcb-clean'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='flushbyasid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='pause-filter'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='pfthreshold'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='svme-addr-chk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='lfence-always-serializing'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='rdctl-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='mds-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='pschange-mc-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='gds-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='rfds-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='disable' name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </mode>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <mode name='custom' supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-noTSX'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server-noTSX'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server-v5'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cooperlake'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cooperlake-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cooperlake-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Denverton'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mpx'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Denverton-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mpx'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Denverton-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Denverton-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Dhyana-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Genoa'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amd-psfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='auto-ibrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='no-nested-data-bp'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='null-sel-clr-base'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='stibp-always-on'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Genoa-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amd-psfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='auto-ibrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='no-nested-data-bp'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='null-sel-clr-base'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='stibp-always-on'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Milan'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Milan-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Milan-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amd-psfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='no-nested-data-bp'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='null-sel-clr-base'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='stibp-always-on'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Rome'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Rome-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Rome-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Rome-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='GraniteRapids'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mcdt-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pbrsb-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='prefetchiti'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='GraniteRapids-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mcdt-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pbrsb-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='prefetchiti'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='GraniteRapids-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx10'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx10-128'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx10-256'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx10-512'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mcdt-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pbrsb-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='prefetchiti'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-noTSX'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-noTSX-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-noTSX'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v5'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v6'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v7'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='IvyBridge'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='IvyBridge-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='IvyBridge-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='IvyBridge-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='KnightsMill'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-4fmaps'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-4vnniw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512er'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512pf'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='KnightsMill-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-4fmaps'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-4vnniw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512er'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512pf'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Opteron_G4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fma4'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xop'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Opteron_G4-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fma4'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xop'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Opteron_G5'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fma4'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tbm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xop'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Opteron_G5-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fma4'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tbm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xop'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='SapphireRapids'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='SapphireRapids-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='SapphireRapids-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='SapphireRapids-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='SierraForest'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-ne-convert'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cmpccxadd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mcdt-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pbrsb-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='SierraForest-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-ne-convert'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cmpccxadd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mcdt-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pbrsb-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-v5'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Snowridge'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='core-capability'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mpx'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='split-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Snowridge-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='core-capability'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mpx'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='split-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Snowridge-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='core-capability'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='split-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Snowridge-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='core-capability'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='split-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Snowridge-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='athlon'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnow'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnowext'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='athlon-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnow'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnowext'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='core2duo'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='core2duo-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='coreduo'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='coreduo-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='n270'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='n270-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='phenom'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnow'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnowext'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='phenom-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnow'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnowext'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </mode>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  </cpu>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <memoryBacking supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <enum name='sourceType'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <value>file</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <value>anonymous</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <value>memfd</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  </memoryBacking>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <devices>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <disk supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='diskDevice'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>disk</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>cdrom</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>floppy</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>lun</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='bus'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>ide</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>fdc</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>scsi</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>usb</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>sata</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='model'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio-transitional</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio-non-transitional</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </disk>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <graphics supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='type'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>vnc</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>egl-headless</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>dbus</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </graphics>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <video supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='modelType'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>vga</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>cirrus</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>none</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>bochs</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>ramfb</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </video>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <hostdev supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='mode'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>subsystem</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='startupPolicy'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>default</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>mandatory</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>requisite</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>optional</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='subsysType'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>usb</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>pci</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>scsi</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='capsType'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='pciBackend'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </hostdev>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <rng supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='model'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio-transitional</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio-non-transitional</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='backendModel'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>random</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>egd</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>builtin</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </rng>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <filesystem supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='driverType'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>path</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>handle</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtiofs</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </filesystem>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <tpm supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='model'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>tpm-tis</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>tpm-crb</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='backendModel'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>emulator</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>external</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='backendVersion'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>2.0</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </tpm>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <redirdev supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='bus'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>usb</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </redirdev>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <channel supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='type'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>pty</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>unix</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </channel>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <crypto supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='model'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='type'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>qemu</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='backendModel'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>builtin</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </crypto>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <interface supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='backendType'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>default</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>passt</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </interface>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <panic supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='model'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>isa</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>hyperv</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </panic>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  </devices>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <features>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <gic supported='no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <vmcoreinfo supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <genid supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <backingStoreInput supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <backup supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <async-teardown supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <ps2 supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <sev supported='no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <sgx supported='no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <hyperv supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='features'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>relaxed</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>vapic</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>spinlocks</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>vpindex</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>runtime</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>synic</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>stimer</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>reset</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>vendor_id</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>frequencies</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>reenlightenment</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>tlbflush</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>ipi</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>avic</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>emsr_bitmap</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>xmm_input</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </hyperv>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <launchSecurity supported='no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  </features>
Oct 10 06:05:40 np0005479823 nova_compute[234571]: </domainCapabilities>
Oct 10 06:05:40 np0005479823 nova_compute[234571]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 10 06:05:40 np0005479823 nova_compute[234571]: 2025-10-10 10:05:40.288 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct 10 06:05:40 np0005479823 nova_compute[234571]: 2025-10-10 10:05:40.292 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct 10 06:05:40 np0005479823 nova_compute[234571]: <domainCapabilities>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <path>/usr/libexec/qemu-kvm</path>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <domain>kvm</domain>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <machine>pc-q35-rhel9.6.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <arch>x86_64</arch>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <vcpu max='4096'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <iothreads supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <os supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <enum name='firmware'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <value>efi</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <loader supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='type'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>rom</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>pflash</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='readonly'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>yes</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>no</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='secure'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>yes</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>no</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </loader>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  </os>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <cpu>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <mode name='host-passthrough' supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='hostPassthroughMigratable'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>on</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>off</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </mode>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <mode name='maximum' supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='maximumMigratable'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>on</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>off</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </mode>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <mode name='host-model' supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model fallback='forbid'>EPYC-Rome</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <vendor>AMD</vendor>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='x2apic'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='tsc-deadline'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='hypervisor'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='tsc_adjust'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='spec-ctrl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='stibp'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='arch-capabilities'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='ssbd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='cmp_legacy'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='overflow-recov'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='succor'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='ibrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='amd-ssbd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='virt-ssbd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='lbrv'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='tsc-scale'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='vmcb-clean'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='flushbyasid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='pause-filter'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='pfthreshold'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='svme-addr-chk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='lfence-always-serializing'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='rdctl-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='mds-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='pschange-mc-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='gds-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='rfds-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='disable' name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </mode>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <mode name='custom' supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-noTSX'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server-noTSX'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server-v5'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cooperlake'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cooperlake-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cooperlake-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Denverton'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mpx'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Denverton-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mpx'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Denverton-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Denverton-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Dhyana-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Genoa'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amd-psfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='auto-ibrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='no-nested-data-bp'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='null-sel-clr-base'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='stibp-always-on'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Genoa-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amd-psfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='auto-ibrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='no-nested-data-bp'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='null-sel-clr-base'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='stibp-always-on'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Milan'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Milan-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Milan-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amd-psfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='no-nested-data-bp'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='null-sel-clr-base'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='stibp-always-on'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Rome'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Rome-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Rome-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Rome-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='GraniteRapids'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mcdt-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pbrsb-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='prefetchiti'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='GraniteRapids-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mcdt-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pbrsb-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='prefetchiti'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='GraniteRapids-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx10'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx10-128'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx10-256'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx10-512'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mcdt-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pbrsb-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='prefetchiti'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-noTSX'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-noTSX-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-noTSX'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v5'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v6'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v7'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='IvyBridge'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='IvyBridge-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='IvyBridge-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='IvyBridge-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='KnightsMill'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-4fmaps'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-4vnniw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512er'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512pf'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='KnightsMill-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-4fmaps'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-4vnniw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512er'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512pf'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Opteron_G4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fma4'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xop'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Opteron_G4-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fma4'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xop'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Opteron_G5'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fma4'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tbm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xop'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Opteron_G5-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fma4'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tbm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xop'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='SapphireRapids'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='SapphireRapids-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='SapphireRapids-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='SapphireRapids-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='SierraForest'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-ne-convert'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cmpccxadd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mcdt-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pbrsb-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='SierraForest-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-ne-convert'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cmpccxadd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mcdt-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pbrsb-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-v5'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Snowridge'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='core-capability'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mpx'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='split-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Snowridge-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='core-capability'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mpx'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='split-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Snowridge-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='core-capability'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='split-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Snowridge-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='core-capability'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='split-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Snowridge-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='athlon'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnow'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnowext'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='athlon-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnow'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnowext'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='core2duo'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='core2duo-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='coreduo'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='coreduo-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='n270'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='n270-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='phenom'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnow'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnowext'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='phenom-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnow'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnowext'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </mode>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  </cpu>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <memoryBacking supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <enum name='sourceType'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <value>file</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <value>anonymous</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <value>memfd</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  </memoryBacking>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <devices>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <disk supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='diskDevice'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>disk</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>cdrom</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>floppy</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>lun</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='bus'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>fdc</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>scsi</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>usb</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>sata</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='model'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio-transitional</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio-non-transitional</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </disk>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <graphics supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='type'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>vnc</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>egl-headless</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>dbus</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </graphics>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <video supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='modelType'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>vga</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>cirrus</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>none</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>bochs</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>ramfb</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </video>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <hostdev supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='mode'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>subsystem</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='startupPolicy'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>default</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>mandatory</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>requisite</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>optional</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='subsysType'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>usb</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>pci</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>scsi</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='capsType'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='pciBackend'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </hostdev>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <rng supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='model'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio-transitional</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio-non-transitional</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='backendModel'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>random</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>egd</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>builtin</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </rng>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <filesystem supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='driverType'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>path</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>handle</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtiofs</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </filesystem>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <tpm supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='model'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>tpm-tis</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>tpm-crb</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='backendModel'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>emulator</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>external</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='backendVersion'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>2.0</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </tpm>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <redirdev supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='bus'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>usb</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </redirdev>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <channel supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='type'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>pty</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>unix</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </channel>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <crypto supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='model'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='type'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>qemu</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='backendModel'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>builtin</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </crypto>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <interface supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='backendType'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>default</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>passt</value>
Oct 10 06:05:40 np0005479823 python3.9[235476]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </interface>
Oct 10 06:05:40 np0005479823 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <panic supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='model'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>isa</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>hyperv</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </panic>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  </devices>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <features>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <gic supported='no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <vmcoreinfo supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <genid supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <backingStoreInput supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <backup supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <async-teardown supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <ps2 supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <sev supported='no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <sgx supported='no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <hyperv supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='features'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>relaxed</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>vapic</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>spinlocks</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>vpindex</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>runtime</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>synic</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>stimer</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>reset</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>vendor_id</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>frequencies</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>reenlightenment</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>tlbflush</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>ipi</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>avic</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>emsr_bitmap</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>xmm_input</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </hyperv>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <launchSecurity supported='no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  </features>
Oct 10 06:05:40 np0005479823 nova_compute[234571]: </domainCapabilities>
Oct 10 06:05:40 np0005479823 nova_compute[234571]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct 10 06:05:40 np0005479823 nova_compute[234571]: 2025-10-10 10:05:40.342 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct 10 06:05:40 np0005479823 nova_compute[234571]: <domainCapabilities>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <path>/usr/libexec/qemu-kvm</path>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <domain>kvm</domain>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <arch>x86_64</arch>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <vcpu max='240'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <iothreads supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <os supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <enum name='firmware'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <loader supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='type'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>rom</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>pflash</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='readonly'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>yes</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>no</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='secure'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>no</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </loader>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  </os>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <cpu>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <mode name='host-passthrough' supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='hostPassthroughMigratable'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>on</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>off</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </mode>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <mode name='maximum' supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='maximumMigratable'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>on</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>off</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </mode>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <mode name='host-model' supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model fallback='forbid'>EPYC-Rome</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <vendor>AMD</vendor>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='x2apic'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='tsc-deadline'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='hypervisor'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='tsc_adjust'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='spec-ctrl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='stibp'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='arch-capabilities'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='ssbd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='cmp_legacy'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='overflow-recov'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='succor'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='ibrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='amd-ssbd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='virt-ssbd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='lbrv'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='tsc-scale'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='vmcb-clean'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='flushbyasid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='pause-filter'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='pfthreshold'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='svme-addr-chk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='lfence-always-serializing'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='rdctl-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='mds-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='pschange-mc-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='gds-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='require' name='rfds-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <feature policy='disable' name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </mode>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <mode name='custom' supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-noTSX'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Broadwell-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server-noTSX'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cascadelake-Server-v5'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cooperlake'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cooperlake-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Cooperlake-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Denverton'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mpx'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Denverton-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mpx'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Denverton-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Denverton-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Dhyana-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Genoa'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amd-psfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='auto-ibrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='no-nested-data-bp'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='null-sel-clr-base'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='stibp-always-on'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Genoa-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amd-psfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='auto-ibrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='no-nested-data-bp'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='null-sel-clr-base'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='stibp-always-on'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Milan'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Milan-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Milan-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amd-psfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='no-nested-data-bp'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='null-sel-clr-base'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='stibp-always-on'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Rome'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Rome-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Rome-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-Rome-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='EPYC-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='GraniteRapids'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mcdt-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pbrsb-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='prefetchiti'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='GraniteRapids-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mcdt-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pbrsb-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='prefetchiti'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='GraniteRapids-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx10'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx10-128'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx10-256'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx10-512'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mcdt-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pbrsb-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='prefetchiti'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-noTSX'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-noTSX-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Haswell-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-noTSX'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v5'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v6'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Icelake-Server-v7'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='IvyBridge'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='IvyBridge-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='IvyBridge-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='IvyBridge-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='KnightsMill'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-4fmaps'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-4vnniw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512er'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512pf'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='KnightsMill-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-4fmaps'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-4vnniw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512er'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512pf'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Opteron_G4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fma4'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xop'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Opteron_G4-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fma4'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xop'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Opteron_G5'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fma4'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tbm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xop'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Opteron_G5-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fma4'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tbm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xop'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='SapphireRapids'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='SapphireRapids-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='SapphireRapids-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='SapphireRapids-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='amx-tile'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-bf16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-fp16'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bitalg'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrc'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fzrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='la57'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='taa-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xfd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='SierraForest'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-ne-convert'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cmpccxadd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mcdt-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pbrsb-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='SierraForest-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-ifma'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-ne-convert'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx-vnni-int8'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cmpccxadd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fbsdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='fsrs'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ibrs-all'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mcdt-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pbrsb-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='psdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='serialize'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vaes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Client-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='hle'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='rtm'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Skylake-Server-v5'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512bw'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512cd'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512dq'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512f'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='avx512vl'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='invpcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pcid'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='pku'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Snowridge'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='core-capability'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mpx'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='split-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Snowridge-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='core-capability'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='mpx'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='split-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Snowridge-v2'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='core-capability'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='split-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Snowridge-v3'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='core-capability'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='split-lock-detect'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='Snowridge-v4'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='cldemote'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='erms'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='gfni'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdir64b'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='movdiri'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='xsaves'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='athlon'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnow'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnowext'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='athlon-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnow'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnowext'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='core2duo'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='core2duo-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='coreduo'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='coreduo-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='n270'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='n270-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='ss'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='phenom'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnow'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnowext'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <blockers model='phenom-v1'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnow'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <feature name='3dnowext'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </blockers>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </mode>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  </cpu>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <memoryBacking supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <enum name='sourceType'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <value>file</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <value>anonymous</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <value>memfd</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  </memoryBacking>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <devices>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <disk supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='diskDevice'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>disk</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>cdrom</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>floppy</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>lun</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='bus'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>ide</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>fdc</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>scsi</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>usb</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>sata</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='model'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio-transitional</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio-non-transitional</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </disk>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <graphics supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='type'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>vnc</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>egl-headless</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>dbus</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </graphics>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <video supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='modelType'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>vga</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>cirrus</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>none</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>bochs</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>ramfb</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </video>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <hostdev supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='mode'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>subsystem</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='startupPolicy'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>default</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>mandatory</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>requisite</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>optional</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='subsysType'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>usb</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>pci</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>scsi</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='capsType'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='pciBackend'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </hostdev>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <rng supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='model'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio-transitional</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtio-non-transitional</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='backendModel'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>random</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>egd</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>builtin</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </rng>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <filesystem supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='driverType'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>path</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>handle</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>virtiofs</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </filesystem>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <tpm supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='model'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>tpm-tis</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>tpm-crb</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='backendModel'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>emulator</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>external</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='backendVersion'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>2.0</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </tpm>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <redirdev supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='bus'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>usb</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </redirdev>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <channel supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='type'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>pty</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>unix</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </channel>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <crypto supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='model'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='type'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>qemu</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='backendModel'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>builtin</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </crypto>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <interface supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='backendType'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>default</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>passt</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </interface>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <panic supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='model'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>isa</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>hyperv</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </panic>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  </devices>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  <features>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <gic supported='no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <vmcoreinfo supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <genid supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <backingStoreInput supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <backup supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <async-teardown supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <ps2 supported='yes'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <sev supported='no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <sgx supported='no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <hyperv supported='yes'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      <enum name='features'>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>relaxed</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>vapic</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>spinlocks</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>vpindex</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>runtime</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>synic</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>stimer</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>reset</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>vendor_id</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>frequencies</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>reenlightenment</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>tlbflush</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>ipi</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>avic</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>emsr_bitmap</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:        <value>xmm_input</value>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:      </enum>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    </hyperv>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:    <launchSecurity supported='no'/>
Oct 10 06:05:40 np0005479823 nova_compute[234571]:  </features>
Oct 10 06:05:40 np0005479823 nova_compute[234571]: </domainCapabilities>
Oct 10 06:05:40 np0005479823 nova_compute[234571]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct 10 06:05:40 np0005479823 nova_compute[234571]: 2025-10-10 10:05:40.396 2 DEBUG nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct 10 06:05:40 np0005479823 nova_compute[234571]: 2025-10-10 10:05:40.397 2 INFO nova.virt.libvirt.host [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Secure Boot support detected#033[00m
Oct 10 06:05:40 np0005479823 nova_compute[234571]: 2025-10-10 10:05:40.398 2 INFO nova.virt.libvirt.driver [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct 10 06:05:40 np0005479823 nova_compute[234571]: 2025-10-10 10:05:40.398 2 INFO nova.virt.libvirt.driver [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct 10 06:05:40 np0005479823 nova_compute[234571]: 2025-10-10 10:05:40.408 2 DEBUG nova.virt.libvirt.driver [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Oct 10 06:05:40 np0005479823 nova_compute[234571]: 2025-10-10 10:05:40.448 2 INFO nova.virt.node [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Determined node identity dcdfa54c-9f95-46da-9af1-da3e28d81cf0 from /var/lib/nova/compute_id#033[00m
Oct 10 06:05:40 np0005479823 nova_compute[234571]: 2025-10-10 10:05:40.468 2 WARNING nova.compute.manager [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Compute nodes ['dcdfa54c-9f95-46da-9af1-da3e28d81cf0'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Oct 10 06:05:40 np0005479823 nova_compute[234571]: 2025-10-10 10:05:40.518 2 INFO nova.compute.manager [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Oct 10 06:05:40 np0005479823 nova_compute[234571]: 2025-10-10 10:05:40.588 2 WARNING nova.compute.manager [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Oct 10 06:05:40 np0005479823 nova_compute[234571]: 2025-10-10 10:05:40.589 2 DEBUG oslo_concurrency.lockutils [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:05:40 np0005479823 nova_compute[234571]: 2025-10-10 10:05:40.589 2 DEBUG oslo_concurrency.lockutils [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:05:40 np0005479823 nova_compute[234571]: 2025-10-10 10:05:40.589 2 DEBUG oslo_concurrency.lockutils [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:05:40 np0005479823 nova_compute[234571]: 2025-10-10 10:05:40.589 2 DEBUG nova.compute.resource_tracker [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 10 06:05:40 np0005479823 nova_compute[234571]: 2025-10-10 10:05:40.589 2 DEBUG oslo_concurrency.processutils [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:05:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:40 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003da0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:40 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:40 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868004b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:40 np0005479823 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 06:05:40 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:05:40 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:05:40 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:05:40 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:05:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:05:40 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3983368344' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:05:41 np0005479823 nova_compute[234571]: 2025-10-10 10:05:41.013 2 DEBUG oslo_concurrency.processutils [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:05:41 np0005479823 systemd[1]: Starting libvirt nodedev daemon...
Oct 10 06:05:41 np0005479823 systemd[1]: Started libvirt nodedev daemon.
Oct 10 06:05:41 np0005479823 nova_compute[234571]: 2025-10-10 10:05:41.323 2 WARNING nova.virt.libvirt.driver [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:05:41 np0005479823 nova_compute[234571]: 2025-10-10 10:05:41.325 2 DEBUG nova.compute.resource_tracker [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5218MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 10 06:05:41 np0005479823 nova_compute[234571]: 2025-10-10 10:05:41.325 2 DEBUG oslo_concurrency.lockutils [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:05:41 np0005479823 nova_compute[234571]: 2025-10-10 10:05:41.325 2 DEBUG oslo_concurrency.lockutils [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:05:41 np0005479823 nova_compute[234571]: 2025-10-10 10:05:41.338 2 WARNING nova.compute.resource_tracker [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] No compute node record for compute-2.ctlplane.example.com:dcdfa54c-9f95-46da-9af1-da3e28d81cf0: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host dcdfa54c-9f95-46da-9af1-da3e28d81cf0 could not be found.#033[00m
Oct 10 06:05:41 np0005479823 nova_compute[234571]: 2025-10-10 10:05:41.358 2 INFO nova.compute.resource_tracker [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Compute node record created for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com with uuid: dcdfa54c-9f95-46da-9af1-da3e28d81cf0#033[00m
Oct 10 06:05:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:41.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:41 np0005479823 nova_compute[234571]: 2025-10-10 10:05:41.439 2 DEBUG nova.compute.resource_tracker [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 10 06:05:41 np0005479823 nova_compute[234571]: 2025-10-10 10:05:41.440 2 DEBUG nova.compute.resource_tracker [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 10 06:05:41 np0005479823 python3.9[235711]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 10 06:05:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:05:41.456 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:05:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:05:41.457 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:05:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:05:41.457 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:05:41 np0005479823 systemd[1]: Stopping nova_compute container...
Oct 10 06:05:41 np0005479823 nova_compute[234571]: 2025-10-10 10:05:41.566 2 DEBUG oslo_concurrency.lockutils [None req-93d26c6c-d96a-40f5-8030-e3caa795904b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:05:41 np0005479823 nova_compute[234571]: 2025-10-10 10:05:41.567 2 DEBUG oslo_concurrency.lockutils [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct 10 06:05:41 np0005479823 nova_compute[234571]: 2025-10-10 10:05:41.567 2 DEBUG oslo_concurrency.lockutils [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct 10 06:05:41 np0005479823 nova_compute[234571]: 2025-10-10 10:05:41.567 2 DEBUG oslo_concurrency.lockutils [None req-940ec40a-d42f-4a26-9cd7-1188ad7fa579 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct 10 06:05:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:41 np0005479823 virtqemud[235088]: libvirt version: 10.10.0, package: 15.el9 (builder@centos.org, 2025-08-18-13:22:20, )
Oct 10 06:05:41 np0005479823 virtqemud[235088]: hostname: compute-2
Oct 10 06:05:41 np0005479823 virtqemud[235088]: End of file while reading data: Input/output error
Oct 10 06:05:41 np0005479823 systemd[1]: libpod-a677c95a28d87d0fa9989f1bbc85251f20b3e9e51830cb4a7abc069cf607269f.scope: Deactivated successfully.
Oct 10 06:05:41 np0005479823 systemd[1]: libpod-a677c95a28d87d0fa9989f1bbc85251f20b3e9e51830cb4a7abc069cf607269f.scope: Consumed 3.494s CPU time.
Oct 10 06:05:41 np0005479823 conmon[234571]: conmon a677c95a28d87d0fa998 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a677c95a28d87d0fa9989f1bbc85251f20b3e9e51830cb4a7abc069cf607269f.scope/container/memory.events
Oct 10 06:05:41 np0005479823 podman[235717]: 2025-10-10 10:05:41.974520705 +0000 UTC m=+0.450426097 container died a677c95a28d87d0fa9989f1bbc85251f20b3e9e51830cb4a7abc069cf607269f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, tcib_managed=true, container_name=nova_compute, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, org.label-schema.vendor=CentOS)
Oct 10 06:05:41 np0005479823 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a677c95a28d87d0fa9989f1bbc85251f20b3e9e51830cb4a7abc069cf607269f-userdata-shm.mount: Deactivated successfully.
Oct 10 06:05:42 np0005479823 systemd[1]: var-lib-containers-storage-overlay-c42589750c71ce8f99ff19c16cbc797b53200611ad4b531f0f37c2c8d9ef3689-merged.mount: Deactivated successfully.
Oct 10 06:05:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:42.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:42 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:42 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003dc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:42 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:43.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:43 np0005479823 podman[235717]: 2025-10-10 10:05:43.395495796 +0000 UTC m=+1.871401108 container cleanup a677c95a28d87d0fa9989f1bbc85251f20b3e9e51830cb4a7abc069cf607269f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, io.buildah.version=1.41.3, container_name=nova_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 06:05:43 np0005479823 podman[235717]: nova_compute
Oct 10 06:05:43 np0005479823 podman[235749]: nova_compute
Oct 10 06:05:43 np0005479823 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Oct 10 06:05:43 np0005479823 systemd[1]: Stopped nova_compute container.
Oct 10 06:05:43 np0005479823 systemd[1]: Starting nova_compute container...
Oct 10 06:05:43 np0005479823 systemd[1]: Started libcrun container.
Oct 10 06:05:43 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c42589750c71ce8f99ff19c16cbc797b53200611ad4b531f0f37c2c8d9ef3689/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 10 06:05:43 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c42589750c71ce8f99ff19c16cbc797b53200611ad4b531f0f37c2c8d9ef3689/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct 10 06:05:43 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c42589750c71ce8f99ff19c16cbc797b53200611ad4b531f0f37c2c8d9ef3689/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 10 06:05:43 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c42589750c71ce8f99ff19c16cbc797b53200611ad4b531f0f37c2c8d9ef3689/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 10 06:05:43 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c42589750c71ce8f99ff19c16cbc797b53200611ad4b531f0f37c2c8d9ef3689/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 10 06:05:43 np0005479823 podman[235759]: 2025-10-10 10:05:43.571801649 +0000 UTC m=+0.085114836 container init a677c95a28d87d0fa9989f1bbc85251f20b3e9e51830cb4a7abc069cf607269f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 10 06:05:43 np0005479823 podman[235759]: 2025-10-10 10:05:43.579021679 +0000 UTC m=+0.092334836 container start a677c95a28d87d0fa9989f1bbc85251f20b3e9e51830cb4a7abc069cf607269f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible)
Oct 10 06:05:43 np0005479823 podman[235759]: nova_compute
Oct 10 06:05:43 np0005479823 nova_compute[235775]: + sudo -E kolla_set_configs
Oct 10 06:05:43 np0005479823 systemd[1]: Started nova_compute container.
Oct 10 06:05:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Validating config file
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Copying service configuration files
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Deleting /etc/ceph
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Creating directory /etc/ceph
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Setting permission for /etc/ceph
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Writing out command to execute
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 10 06:05:43 np0005479823 nova_compute[235775]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 10 06:05:43 np0005479823 nova_compute[235775]: ++ cat /run_command
Oct 10 06:05:43 np0005479823 nova_compute[235775]: + CMD=nova-compute
Oct 10 06:05:43 np0005479823 nova_compute[235775]: + ARGS=
Oct 10 06:05:43 np0005479823 nova_compute[235775]: + sudo kolla_copy_cacerts
Oct 10 06:05:43 np0005479823 nova_compute[235775]: + [[ ! -n '' ]]
Oct 10 06:05:43 np0005479823 nova_compute[235775]: + . kolla_extend_start
Oct 10 06:05:43 np0005479823 nova_compute[235775]: Running command: 'nova-compute'
Oct 10 06:05:43 np0005479823 nova_compute[235775]: + echo 'Running command: '\''nova-compute'\'''
Oct 10 06:05:43 np0005479823 nova_compute[235775]: + umask 0022
Oct 10 06:05:43 np0005479823 nova_compute[235775]: + exec nova-compute
Oct 10 06:05:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:44.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:44 np0005479823 python3.9[235938]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct 10 06:05:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:44 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868004b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:44 np0005479823 systemd[1]: Started libpod-conmon-95a19b4fa8f1397f8b1735f74117bd5512d5516b425caa220dcdff41907f909e.scope.
Oct 10 06:05:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:44 np0005479823 systemd[1]: Started libcrun container.
Oct 10 06:05:44 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6010f60c692d8f88c6982f744376325ee0757b82dcf004acde8ac551cd6e318/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Oct 10 06:05:44 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6010f60c692d8f88c6982f744376325ee0757b82dcf004acde8ac551cd6e318/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Oct 10 06:05:44 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6010f60c692d8f88c6982f744376325ee0757b82dcf004acde8ac551cd6e318/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 10 06:05:44 np0005479823 podman[235965]: 2025-10-10 10:05:44.709731682 +0000 UTC m=+0.158546487 container init 95a19b4fa8f1397f8b1735f74117bd5512d5516b425caa220dcdff41907f909e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, container_name=nova_compute_init, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 06:05:44 np0005479823 podman[235965]: 2025-10-10 10:05:44.723479331 +0000 UTC m=+0.172294106 container start 95a19b4fa8f1397f8b1735f74117bd5512d5516b425caa220dcdff41907f909e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute_init, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct 10 06:05:44 np0005479823 python3.9[235938]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Oct 10 06:05:44 np0005479823 nova_compute_init[235986]: INFO:nova_statedir:Applying nova statedir ownership
Oct 10 06:05:44 np0005479823 nova_compute_init[235986]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Oct 10 06:05:44 np0005479823 nova_compute_init[235986]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Oct 10 06:05:44 np0005479823 nova_compute_init[235986]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Oct 10 06:05:44 np0005479823 nova_compute_init[235986]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Oct 10 06:05:44 np0005479823 nova_compute_init[235986]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Oct 10 06:05:44 np0005479823 nova_compute_init[235986]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Oct 10 06:05:44 np0005479823 nova_compute_init[235986]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Oct 10 06:05:44 np0005479823 nova_compute_init[235986]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Oct 10 06:05:44 np0005479823 nova_compute_init[235986]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Oct 10 06:05:44 np0005479823 nova_compute_init[235986]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Oct 10 06:05:44 np0005479823 nova_compute_init[235986]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Oct 10 06:05:44 np0005479823 nova_compute_init[235986]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Oct 10 06:05:44 np0005479823 nova_compute_init[235986]: INFO:nova_statedir:Nova statedir ownership complete
Oct 10 06:05:44 np0005479823 systemd[1]: libpod-95a19b4fa8f1397f8b1735f74117bd5512d5516b425caa220dcdff41907f909e.scope: Deactivated successfully.
Oct 10 06:05:44 np0005479823 podman[235987]: 2025-10-10 10:05:44.783979511 +0000 UTC m=+0.030250256 container died 95a19b4fa8f1397f8b1735f74117bd5512d5516b425caa220dcdff41907f909e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute_init, container_name=nova_compute_init, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 10 06:05:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:44 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868004b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:44 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c003de0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:44 np0005479823 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-95a19b4fa8f1397f8b1735f74117bd5512d5516b425caa220dcdff41907f909e-userdata-shm.mount: Deactivated successfully.
Oct 10 06:05:44 np0005479823 systemd[1]: var-lib-containers-storage-overlay-e6010f60c692d8f88c6982f744376325ee0757b82dcf004acde8ac551cd6e318-merged.mount: Deactivated successfully.
Oct 10 06:05:44 np0005479823 podman[235996]: 2025-10-10 10:05:44.856992699 +0000 UTC m=+0.066698088 container cleanup 95a19b4fa8f1397f8b1735f74117bd5512d5516b425caa220dcdff41907f909e (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 10 06:05:44 np0005479823 systemd[1]: libpod-conmon-95a19b4fa8f1397f8b1735f74117bd5512d5516b425caa220dcdff41907f909e.scope: Deactivated successfully.
Oct 10 06:05:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:05:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:05:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:05:45.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:05:45 np0005479823 nova_compute[235775]: 2025-10-10 10:05:45.640 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct 10 06:05:45 np0005479823 nova_compute[235775]: 2025-10-10 10:05:45.640 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct 10 06:05:45 np0005479823 nova_compute[235775]: 2025-10-10 10:05:45.640 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct 10 06:05:45 np0005479823 nova_compute[235775]: 2025-10-10 10:05:45.640 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Oct 10 06:05:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:45 np0005479823 systemd[1]: session-54.scope: Deactivated successfully.
Oct 10 06:05:45 np0005479823 systemd[1]: session-54.scope: Consumed 2min 39.116s CPU time.
Oct 10 06:05:45 np0005479823 systemd-logind[796]: Session 54 logged out. Waiting for processes to exit.
Oct 10 06:05:45 np0005479823 systemd-logind[796]: Removed session 54.
Oct 10 06:05:45 np0005479823 nova_compute[235775]: 2025-10-10 10:05:45.774 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:05:45 np0005479823 nova_compute[235775]: 2025-10-10 10:05:45.804 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:05:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:05:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:05:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:05:46.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:05:46 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:05:46 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.244 2 INFO nova.virt.driver [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.350 2 INFO nova.compute.provider_config [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.358 2 DEBUG oslo_concurrency.lockutils [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.359 2 DEBUG oslo_concurrency.lockutils [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.359 2 DEBUG oslo_concurrency.lockutils [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.360 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.360 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.360 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.360 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.360 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.361 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.361 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.361 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.361 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.361 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.361 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.362 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.362 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.362 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.362 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.362 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.363 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.363 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.363 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.363 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.363 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.363 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.364 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.364 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.364 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.364 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.364 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.364 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.364 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.365 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.365 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.365 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.365 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.365 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.365 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.366 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.366 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.366 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.366 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.366 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.366 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.367 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.367 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.367 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.367 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.368 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.368 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.368 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.368 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.368 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.369 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.369 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.369 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.369 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.369 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.369 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.369 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.370 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.370 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.370 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.370 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.370 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.370 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.370 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.371 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.371 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.371 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.371 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.371 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.372 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.372 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.372 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.372 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.372 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.373 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.373 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.373 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.373 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.373 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.374 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.374 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.374 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.374 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.375 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.375 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.375 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.375 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.375 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.376 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.376 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.376 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.376 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.376 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.376 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.376 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.376 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.377 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.377 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.377 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.377 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.377 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.377 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.378 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.378 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.378 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.378 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.378 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.379 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.379 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.379 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.379 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.380 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.380 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.380 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.380 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.380 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.381 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.381 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.381 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.381 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.381 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.381 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.382 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.382 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.382 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.382 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.382 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.382 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.382 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.383 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.383 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.383 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.383 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.383 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.383 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.383 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.384 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.384 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.384 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.384 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.384 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.384 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.385 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.385 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.385 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.385 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.386 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.386 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.386 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.386 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.386 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.387 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.387 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.387 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.387 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.387 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.388 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.388 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.388 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.388 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.388 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.388 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.389 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.389 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.389 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.389 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.389 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.390 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.390 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.390 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.390 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.390 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.390 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.390 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.391 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.391 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.391 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.391 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.391 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.391 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.392 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.392 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.392 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.392 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.392 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.392 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.393 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.393 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.393 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.393 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.393 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.394 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.394 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.394 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.394 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.394 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.394 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.395 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.395 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.395 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.395 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.395 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.395 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.395 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.396 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.396 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.396 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.396 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.396 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.396 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.396 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.397 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.397 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.397 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.397 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.397 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.397 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.397 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.398 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.398 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.398 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.398 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.398 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.398 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.398 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.399 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.399 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.399 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.399 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.399 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.399 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.400 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.400 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.400 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.400 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.400 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.400 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.400 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.401 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.401 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.401 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.401 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.401 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.401 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.401 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.402 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.402 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.402 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.402 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.402 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.402 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.402 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.403 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.403 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.403 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.403 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.403 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.403 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.403 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.403 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.404 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.404 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.404 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.404 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.404 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.404 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.404 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.405 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.405 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.405 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.405 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.405 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.405 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.405 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.406 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.406 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.406 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.406 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.406 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.406 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.406 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.407 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.407 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.407 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.407 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.407 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.407 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.407 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.407 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.408 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.408 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.408 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.408 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.408 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.408 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.408 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.409 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.409 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.409 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.409 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.409 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.409 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.409 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.410 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.410 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.410 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.410 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.410 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.410 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.410 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.411 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.411 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.411 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.411 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.411 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.411 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.412 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.412 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.412 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.412 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.412 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.412 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.412 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.413 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.413 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.413 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.413 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.413 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.413 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.414 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.414 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.414 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.414 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.414 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.414 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.414 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.415 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.415 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.415 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.415 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.416 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.416 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.416 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.416 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.416 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.416 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.416 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.417 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.417 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.417 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.417 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.417 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.417 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.418 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.418 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.418 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.418 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.418 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.419 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.419 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.419 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.419 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.419 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.420 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.420 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.420 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.420 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.420 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.421 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.421 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.421 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.421 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.421 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.421 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.422 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.422 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.422 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.422 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.422 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.422 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.423 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.423 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.423 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.423 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.423 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.423 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.424 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.424 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.424 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.424 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.424 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.424 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.424 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.425 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.425 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.425 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.425 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.425 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.425 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.426 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.426 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.426 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.426 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.426 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.426 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.426 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.427 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.427 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.427 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.427 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.427 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.427 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.428 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.428 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.428 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.428 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.428 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.429 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.429 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.429 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.429 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.429 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.429 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.429 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.430 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.430 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.430 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.430 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.430 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.430 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.430 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.431 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.431 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.431 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.431 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.431 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.431 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.432 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.432 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.432 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.432 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.432 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.432 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.432 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.433 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.433 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.433 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.433 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.433 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.433 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.433 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.434 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.434 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.434 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.434 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.435 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.435 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.435 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.435 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.435 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.436 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.436 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.436 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.436 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.436 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.437 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.437 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.437 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.437 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.437 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.438 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.438 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.438 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.438 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.438 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.439 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.439 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.439 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.439 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.440 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.440 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.440 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.440 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.440 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.441 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.441 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.441 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.441 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.441 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.442 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.442 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.442 2 WARNING oslo_config.cfg [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct 10 06:05:46 np0005479823 nova_compute[235775]: live_migration_uri is deprecated for removal in favor of two other options that
Oct 10 06:05:46 np0005479823 nova_compute[235775]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct 10 06:05:46 np0005479823 nova_compute[235775]: and ``live_migration_inbound_addr`` respectively.
Oct 10 06:05:46 np0005479823 nova_compute[235775]: ).  Its value may be silently ignored in the future.#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.443 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.443 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.443 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.443 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.443 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.444 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.444 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.444 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.444 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.444 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.445 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.445 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.445 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.445 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.446 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.446 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.446 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.446 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.446 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.rbd_secret_uuid        = 21f084a3-af34-5230-afe4-ea5cd24a55f4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.447 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.447 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.447 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.447 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.448 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.448 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.448 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.448 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.448 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.449 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.449 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.449 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.449 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.449 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.450 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.450 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.450 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.450 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.450 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.451 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.451 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.451 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.451 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.451 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.452 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.452 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.452 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.452 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.452 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.453 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.453 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.453 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.453 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.453 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.454 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.454 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.454 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.454 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.454 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.455 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.455 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.455 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.455 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.455 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.456 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.456 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.456 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.456 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.456 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.456 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.457 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.457 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.457 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.457 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.457 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.458 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.458 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.458 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.458 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.458 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.459 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.459 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.459 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.459 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.459 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.460 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.460 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.460 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.460 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.460 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.461 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.461 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.461 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.461 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.461 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.462 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.462 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.462 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.462 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.462 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.463 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.463 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.463 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.463 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.463 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.464 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.464 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.464 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.464 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.464 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.465 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.465 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.465 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.465 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.465 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.466 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.466 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.466 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.466 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.466 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.467 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.467 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.467 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.467 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.467 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.468 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.468 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.468 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.468 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.468 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.469 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.469 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.469 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.469 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.469 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.470 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.470 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.470 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.470 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.470 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.471 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.471 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.471 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.471 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.472 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.472 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.472 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.472 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.472 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.473 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.473 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.473 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.473 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.473 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.474 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.474 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.474 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.474 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.474 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.475 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.475 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.475 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.475 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.475 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.476 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.476 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.476 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.476 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.476 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.477 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.477 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.477 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.477 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.478 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.478 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.478 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.478 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.478 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.479 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.479 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.479 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.479 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.480 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.480 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.480 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.480 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.481 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.481 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.481 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.481 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.481 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.482 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.482 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.482 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.482 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.482 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.483 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.483 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.483 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.483 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.484 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.484 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.484 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.484 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.484 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.485 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.485 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.485 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.485 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.485 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.485 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.486 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.486 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.486 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.486 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.486 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.487 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.487 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.487 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.487 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.487 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.488 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.488 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.488 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.488 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.488 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.489 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.489 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.489 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.489 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.490 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.490 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.490 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.490 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.490 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.490 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.491 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.491 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.491 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.491 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.491 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.492 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.492 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.492 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.492 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.492 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.493 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.493 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.493 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.493 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.494 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.494 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.494 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.494 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.495 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.495 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.495 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.495 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.495 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.496 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.496 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.496 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.496 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.496 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.497 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.497 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.497 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.497 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.497 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.498 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.498 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.498 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.498 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.498 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.499 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.499 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.499 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.499 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.499 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.500 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.500 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.500 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.500 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.500 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.501 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.501 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.501 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.501 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.501 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.502 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.502 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.502 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.502 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.502 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.503 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.503 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.503 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.503 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.503 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.504 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.504 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.504 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.504 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.504 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.505 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.505 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.505 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.505 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.506 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.506 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.506 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.507 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.507 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.507 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.507 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.508 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.508 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.508 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.508 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.508 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.509 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.509 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.509 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.509 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.510 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.510 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.510 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.510 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.510 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.511 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.511 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.511 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.511 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.511 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.511 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.512 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.512 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.512 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.512 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.512 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.512 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.513 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.513 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.513 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.513 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.513 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.513 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.514 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.514 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.514 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.514 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.514 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.514 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.514 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.514 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.515 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.515 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.515 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.515 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.515 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.515 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.515 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.516 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.516 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.516 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.516 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.516 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.517 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.517 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.517 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.517 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.517 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.518 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.518 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.518 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.518 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.518 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.518 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.519 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.519 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.519 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.519 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.519 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.519 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.519 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.520 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.520 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.520 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.520 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.520 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.520 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.520 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.521 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.521 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.521 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.521 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.521 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.521 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.522 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.522 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.522 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.522 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.522 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.522 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.523 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.523 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.523 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.523 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.523 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.523 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.523 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.524 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.524 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.524 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.524 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.524 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.525 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.525 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.525 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.525 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.525 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.526 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.526 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.526 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.526 2 DEBUG oslo_service.service [None req-4e8231f7-96f9-48cb-96ca-3e8b409b541e - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.527 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.570 2 INFO nova.virt.node [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Determined node identity dcdfa54c-9f95-46da-9af1-da3e28d81cf0 from /var/lib/nova/compute_id#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.571 2 DEBUG nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.572 2 DEBUG nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.572 2 DEBUG nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.572 2 DEBUG nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.587 2 DEBUG nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f9eb1678760> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.590 2 DEBUG nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f9eb1678760> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.591 2 INFO nova.virt.libvirt.driver [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Connection event '1' reason 'None'#033[00m
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.596 2 INFO nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Libvirt host capabilities <capabilities>
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <host>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <uuid>55d065af-0252-4401-ad6e-822a36bead06</uuid>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <cpu>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <arch>x86_64</arch>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model>EPYC-Rome-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <vendor>AMD</vendor>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <microcode version='16777317'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <signature family='23' model='49' stepping='0'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <maxphysaddr mode='emulate' bits='40'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature name='x2apic'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature name='tsc-deadline'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature name='osxsave'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature name='hypervisor'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature name='tsc_adjust'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature name='spec-ctrl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature name='stibp'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature name='arch-capabilities'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature name='ssbd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature name='cmp_legacy'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature name='topoext'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature name='virt-ssbd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature name='lbrv'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature name='tsc-scale'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature name='vmcb-clean'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature name='pause-filter'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature name='pfthreshold'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature name='svme-addr-chk'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature name='rdctl-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature name='skip-l1dfl-vmentry'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature name='mds-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature name='pschange-mc-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <pages unit='KiB' size='4'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <pages unit='KiB' size='2048'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <pages unit='KiB' size='1048576'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </cpu>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <power_management>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <suspend_mem/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </power_management>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <iommu support='no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <migration_features>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <live/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <uri_transports>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <uri_transport>tcp</uri_transport>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <uri_transport>rdma</uri_transport>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </uri_transports>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </migration_features>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <topology>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <cells num='1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <cell id='0'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:          <memory unit='KiB'>7864356</memory>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:          <pages unit='KiB' size='4'>1966089</pages>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:          <pages unit='KiB' size='2048'>0</pages>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:          <pages unit='KiB' size='1048576'>0</pages>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:          <distances>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:            <sibling id='0' value='10'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:          </distances>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:          <cpus num='8'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:          </cpus>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        </cell>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </cells>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </topology>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <cache>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </cache>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <secmodel>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model>selinux</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <doi>0</doi>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </secmodel>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <secmodel>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model>dac</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <doi>0</doi>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <baselabel type='kvm'>+107:+107</baselabel>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <baselabel type='qemu'>+107:+107</baselabel>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </secmodel>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  </host>
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <guest>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <os_type>hvm</os_type>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <arch name='i686'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <wordsize>32</wordsize>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <domain type='qemu'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <domain type='kvm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </arch>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <features>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <pae/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <nonpae/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <acpi default='on' toggle='yes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <apic default='on' toggle='no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <cpuselection/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <deviceboot/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <disksnapshot default='on' toggle='no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <externalSnapshot/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </features>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  </guest>
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <guest>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <os_type>hvm</os_type>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <arch name='x86_64'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <wordsize>64</wordsize>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <domain type='qemu'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <domain type='kvm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </arch>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <features>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <acpi default='on' toggle='yes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <apic default='on' toggle='no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <cpuselection/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <deviceboot/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <disksnapshot default='on' toggle='no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <externalSnapshot/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </features>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  </guest>
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 
Oct 10 06:05:46 np0005479823 nova_compute[235775]: </capabilities>
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.606 2 DEBUG nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.613 2 DEBUG nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct 10 06:05:46 np0005479823 nova_compute[235775]: <domainCapabilities>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <path>/usr/libexec/qemu-kvm</path>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <domain>kvm</domain>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <machine>pc-q35-rhel9.6.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <arch>i686</arch>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <vcpu max='4096'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <iothreads supported='yes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <os supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <enum name='firmware'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <loader supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='type'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>rom</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>pflash</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='readonly'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>yes</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>no</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='secure'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>no</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </loader>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  </os>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <cpu>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <mode name='host-passthrough' supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='hostPassthroughMigratable'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>on</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>off</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </mode>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <mode name='maximum' supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='maximumMigratable'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>on</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>off</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </mode>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <mode name='host-model' supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model fallback='forbid'>EPYC-Rome</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <vendor>AMD</vendor>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='x2apic'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='tsc-deadline'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='hypervisor'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='tsc_adjust'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='spec-ctrl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='stibp'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='arch-capabilities'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='ssbd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='cmp_legacy'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='overflow-recov'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='succor'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='ibrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='amd-ssbd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='virt-ssbd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='lbrv'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='tsc-scale'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='vmcb-clean'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='flushbyasid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='pause-filter'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='pfthreshold'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='svme-addr-chk'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='lfence-always-serializing'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='rdctl-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='mds-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='pschange-mc-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='gds-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='rfds-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='disable' name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </mode>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <mode name='custom' supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Broadwell'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Broadwell-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Broadwell-noTSX'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Broadwell-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Broadwell-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Broadwell-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Broadwell-v4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cascadelake-Server'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cascadelake-Server-noTSX'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cascadelake-Server-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cascadelake-Server-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cascadelake-Server-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cascadelake-Server-v4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cascadelake-Server-v5'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cooperlake'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cooperlake-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cooperlake-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Denverton'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mpx'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Denverton-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mpx'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Denverton-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Denverton-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Dhyana-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Genoa'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amd-psfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='auto-ibrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='no-nested-data-bp'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='null-sel-clr-base'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='stibp-always-on'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Genoa-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amd-psfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='auto-ibrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='no-nested-data-bp'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='null-sel-clr-base'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='stibp-always-on'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Milan'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Milan-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Milan-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amd-psfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='no-nested-data-bp'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='null-sel-clr-base'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='stibp-always-on'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Rome'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Rome-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Rome-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Rome-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-v4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='GraniteRapids'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-tile'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fbsdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrc'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fzrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mcdt-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pbrsb-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='prefetchiti'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='psdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='GraniteRapids-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-tile'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fbsdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrc'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fzrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mcdt-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pbrsb-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='prefetchiti'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='psdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='GraniteRapids-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-tile'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx10'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx10-128'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx10-256'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx10-512'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cldemote'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fbsdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrc'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fzrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mcdt-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdir64b'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdiri'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pbrsb-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='prefetchiti'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='psdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Haswell'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Haswell-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Haswell-noTSX'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Haswell-noTSX-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Haswell-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Haswell-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Haswell-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Haswell-v4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server-noTSX'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server-v4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server-v5'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server-v6'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:05:46 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server-v7'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='IvyBridge'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='IvyBridge-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='IvyBridge-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='IvyBridge-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='KnightsMill'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-4fmaps'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-4vnniw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512er'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512pf'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='KnightsMill-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-4fmaps'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-4vnniw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512er'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512pf'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Opteron_G4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fma4'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xop'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Opteron_G4-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fma4'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xop'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Opteron_G5'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fma4'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tbm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xop'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Opteron_G5-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fma4'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tbm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xop'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='SapphireRapids'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-tile'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrc'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fzrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='SapphireRapids-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-tile'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrc'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fzrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='SapphireRapids-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-tile'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fbsdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrc'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fzrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='psdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='SapphireRapids-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-tile'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cldemote'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fbsdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrc'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fzrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdir64b'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdiri'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='psdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='SierraForest'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-ne-convert'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cmpccxadd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fbsdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mcdt-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pbrsb-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='psdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='SierraForest-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-ne-convert'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cmpccxadd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fbsdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mcdt-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pbrsb-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='psdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Client'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:05:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:05:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Client-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Client-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Client-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Client-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Client-v4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Server'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Server-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Server-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Server-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Server-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Server-v4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Server-v5'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Snowridge'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cldemote'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='core-capability'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdir64b'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdiri'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mpx'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='split-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Snowridge-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cldemote'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='core-capability'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdir64b'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdiri'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mpx'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='split-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Snowridge-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cldemote'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='core-capability'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdir64b'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdiri'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='split-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Snowridge-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cldemote'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='core-capability'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdir64b'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdiri'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='split-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Snowridge-v4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cldemote'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdir64b'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdiri'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='athlon'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='3dnow'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='3dnowext'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='athlon-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='3dnow'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='3dnowext'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='core2duo'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='core2duo-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='coreduo'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='coreduo-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='n270'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='n270-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='phenom'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='3dnow'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='3dnowext'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='phenom-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='3dnow'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='3dnowext'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </mode>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  </cpu>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <memoryBacking supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <enum name='sourceType'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <value>file</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <value>anonymous</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <value>memfd</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  </memoryBacking>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <devices>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <disk supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='diskDevice'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>disk</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>cdrom</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>floppy</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>lun</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='bus'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>fdc</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>scsi</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>virtio</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>usb</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>sata</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='model'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>virtio</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>virtio-transitional</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>virtio-non-transitional</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </disk>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <graphics supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='type'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>vnc</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>egl-headless</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>dbus</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </graphics>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <video supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='modelType'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>vga</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>cirrus</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>virtio</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>none</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>bochs</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>ramfb</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </video>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <hostdev supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='mode'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>subsystem</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='startupPolicy'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>default</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>mandatory</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>requisite</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>optional</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='subsysType'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>usb</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>pci</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>scsi</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='capsType'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='pciBackend'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </hostdev>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <rng supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='model'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>virtio</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>virtio-transitional</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>virtio-non-transitional</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='backendModel'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>random</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>egd</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>builtin</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </rng>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <filesystem supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='driverType'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>path</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>handle</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>virtiofs</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </filesystem>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <tpm supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='model'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>tpm-tis</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>tpm-crb</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='backendModel'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>emulator</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>external</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='backendVersion'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>2.0</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </tpm>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <redirdev supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='bus'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>usb</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </redirdev>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <channel supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='type'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>pty</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>unix</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </channel>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <crypto supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='model'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='type'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>qemu</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='backendModel'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>builtin</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </crypto>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <interface supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='backendType'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>default</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>passt</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </interface>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <panic supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='model'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>isa</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>hyperv</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </panic>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  </devices>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <features>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <gic supported='no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <vmcoreinfo supported='yes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <genid supported='yes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <backingStoreInput supported='yes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <backup supported='yes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <async-teardown supported='yes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <ps2 supported='yes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <sev supported='no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <sgx supported='no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <hyperv supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='features'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>relaxed</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>vapic</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>spinlocks</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>vpindex</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>runtime</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>synic</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>stimer</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>reset</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>vendor_id</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>frequencies</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>reenlightenment</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>tlbflush</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>ipi</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>avic</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>emsr_bitmap</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>xmm_input</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </hyperv>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <launchSecurity supported='no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  </features>
Oct 10 06:05:46 np0005479823 nova_compute[235775]: </domainCapabilities>
Oct 10 06:05:46 np0005479823 nova_compute[235775]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.618 2 DEBUG nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct 10 06:05:46 np0005479823 nova_compute[235775]: <domainCapabilities>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <path>/usr/libexec/qemu-kvm</path>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <domain>kvm</domain>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <arch>i686</arch>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <vcpu max='240'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <iothreads supported='yes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <os supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <enum name='firmware'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <loader supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='type'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>rom</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>pflash</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='readonly'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>yes</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>no</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='secure'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>no</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </loader>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  </os>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <cpu>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <mode name='host-passthrough' supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='hostPassthroughMigratable'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>on</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>off</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </mode>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <mode name='maximum' supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='maximumMigratable'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>on</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>off</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </mode>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <mode name='host-model' supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model fallback='forbid'>EPYC-Rome</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <vendor>AMD</vendor>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='x2apic'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='tsc-deadline'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='hypervisor'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='tsc_adjust'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='spec-ctrl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='stibp'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='arch-capabilities'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='ssbd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='cmp_legacy'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='overflow-recov'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='succor'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='ibrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='amd-ssbd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='virt-ssbd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='lbrv'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='tsc-scale'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='vmcb-clean'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='flushbyasid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='pause-filter'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='pfthreshold'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='svme-addr-chk'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='lfence-always-serializing'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='rdctl-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='mds-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='pschange-mc-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='gds-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='rfds-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='disable' name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </mode>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <mode name='custom' supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Broadwell'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Broadwell-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Broadwell-noTSX'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Broadwell-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Broadwell-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Broadwell-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Broadwell-v4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cascadelake-Server'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cascadelake-Server-noTSX'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cascadelake-Server-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cascadelake-Server-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cascadelake-Server-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cascadelake-Server-v4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cascadelake-Server-v5'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cooperlake'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cooperlake-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cooperlake-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Denverton'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mpx'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Denverton-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mpx'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Denverton-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Denverton-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Dhyana-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Genoa'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amd-psfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='auto-ibrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='no-nested-data-bp'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='null-sel-clr-base'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='stibp-always-on'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Genoa-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amd-psfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='auto-ibrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='no-nested-data-bp'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='null-sel-clr-base'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='stibp-always-on'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Milan'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Milan-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Milan-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amd-psfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='no-nested-data-bp'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='null-sel-clr-base'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='stibp-always-on'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Rome'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Rome-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Rome-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Rome-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-v4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='GraniteRapids'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-tile'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fbsdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrc'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fzrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mcdt-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pbrsb-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='prefetchiti'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='psdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='GraniteRapids-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-tile'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fbsdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrc'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fzrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mcdt-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pbrsb-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='prefetchiti'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='psdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='GraniteRapids-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-tile'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx10'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx10-128'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx10-256'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx10-512'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cldemote'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fbsdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrc'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fzrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mcdt-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdir64b'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdiri'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pbrsb-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='prefetchiti'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='psdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Haswell'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Haswell-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Haswell-noTSX'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Haswell-noTSX-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Haswell-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Haswell-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Haswell-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Haswell-v4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server-noTSX'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server-v4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server-v5'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server-v6'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server-v7'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='IvyBridge'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='IvyBridge-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='IvyBridge-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='IvyBridge-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='KnightsMill'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-4fmaps'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-4vnniw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512er'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512pf'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='KnightsMill-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-4fmaps'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-4vnniw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512er'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512pf'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Opteron_G4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fma4'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xop'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Opteron_G4-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fma4'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xop'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Opteron_G5'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fma4'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tbm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xop'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Opteron_G5-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fma4'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tbm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xop'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='SapphireRapids'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-tile'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrc'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fzrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='SapphireRapids-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-tile'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrc'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fzrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='SapphireRapids-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-tile'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fbsdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrc'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fzrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='psdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='SapphireRapids-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-tile'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cldemote'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fbsdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrc'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fzrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdir64b'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdiri'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='psdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='SierraForest'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-ne-convert'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cmpccxadd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fbsdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mcdt-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pbrsb-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='psdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='SierraForest-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-ne-convert'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cmpccxadd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fbsdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mcdt-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pbrsb-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='psdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Client'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Client-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Client-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Client-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Client-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Client-v4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Server'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Server-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Server-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Server-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Server-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Server-v4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Server-v5'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Snowridge'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cldemote'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='core-capability'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdir64b'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdiri'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mpx'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='split-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Snowridge-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cldemote'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='core-capability'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdir64b'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdiri'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mpx'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='split-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Snowridge-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cldemote'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='core-capability'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdir64b'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdiri'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='split-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Snowridge-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cldemote'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='core-capability'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdir64b'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdiri'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='split-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Snowridge-v4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cldemote'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdir64b'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdiri'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='athlon'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='3dnow'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='3dnowext'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='athlon-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='3dnow'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='3dnowext'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='core2duo'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='core2duo-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='coreduo'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='coreduo-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='n270'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='n270-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='phenom'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='3dnow'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='3dnowext'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='phenom-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='3dnow'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='3dnowext'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </mode>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  </cpu>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <memoryBacking supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <enum name='sourceType'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <value>file</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <value>anonymous</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <value>memfd</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  </memoryBacking>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <devices>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <disk supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='diskDevice'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>disk</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>cdrom</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>floppy</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>lun</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='bus'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>ide</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>fdc</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>scsi</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>virtio</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>usb</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>sata</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='model'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>virtio</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>virtio-transitional</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>virtio-non-transitional</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </disk>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <graphics supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='type'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>vnc</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>egl-headless</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>dbus</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </graphics>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <video supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='modelType'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>vga</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>cirrus</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>virtio</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>none</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>bochs</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>ramfb</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </video>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <hostdev supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='mode'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>subsystem</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='startupPolicy'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>default</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>mandatory</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>requisite</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>optional</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='subsysType'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>usb</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>pci</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>scsi</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='capsType'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='pciBackend'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </hostdev>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <rng supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='model'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>virtio</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>virtio-transitional</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>virtio-non-transitional</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='backendModel'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>random</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>egd</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>builtin</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </rng>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <filesystem supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='driverType'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>path</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>handle</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>virtiofs</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </filesystem>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <tpm supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='model'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>tpm-tis</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>tpm-crb</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='backendModel'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>emulator</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>external</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='backendVersion'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>2.0</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </tpm>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <redirdev supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='bus'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>usb</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </redirdev>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <channel supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='type'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>pty</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>unix</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </channel>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <crypto supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='model'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='type'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>qemu</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='backendModel'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>builtin</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </crypto>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <interface supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='backendType'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>default</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>passt</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </interface>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <panic supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='model'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>isa</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>hyperv</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </panic>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  </devices>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <features>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <gic supported='no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <vmcoreinfo supported='yes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <genid supported='yes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <backingStoreInput supported='yes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <backup supported='yes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <async-teardown supported='yes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <ps2 supported='yes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <sev supported='no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <sgx supported='no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <hyperv supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='features'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>relaxed</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>vapic</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>spinlocks</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>vpindex</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>runtime</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>synic</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>stimer</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>reset</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>vendor_id</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>frequencies</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>reenlightenment</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>tlbflush</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>ipi</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>avic</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>emsr_bitmap</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>xmm_input</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </hyperv>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <launchSecurity supported='no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  </features>
Oct 10 06:05:46 np0005479823 nova_compute[235775]: </domainCapabilities>
Oct 10 06:05:46 np0005479823 nova_compute[235775]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.661 2 DEBUG nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct 10 06:05:46 np0005479823 nova_compute[235775]: 2025-10-10 10:05:46.664 2 DEBUG nova.virt.libvirt.host [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct 10 06:05:46 np0005479823 nova_compute[235775]: <domainCapabilities>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <path>/usr/libexec/qemu-kvm</path>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <domain>kvm</domain>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <machine>pc-q35-rhel9.6.0</machine>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <arch>x86_64</arch>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <vcpu max='4096'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <iothreads supported='yes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <os supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <enum name='firmware'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <value>efi</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <loader supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='type'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>rom</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>pflash</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='readonly'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>yes</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>no</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='secure'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>yes</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>no</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </loader>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  </os>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <cpu>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <mode name='host-passthrough' supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='hostPassthroughMigratable'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>on</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>off</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </mode>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <mode name='maximum' supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='maximumMigratable'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>on</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>off</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </mode>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <mode name='host-model' supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model fallback='forbid'>EPYC-Rome</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <vendor>AMD</vendor>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='x2apic'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='tsc-deadline'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='hypervisor'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='tsc_adjust'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='spec-ctrl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='stibp'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='arch-capabilities'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='ssbd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='cmp_legacy'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='overflow-recov'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='succor'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='ibrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='amd-ssbd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='virt-ssbd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='lbrv'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='tsc-scale'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='vmcb-clean'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='flushbyasid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='pause-filter'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='pfthreshold'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='svme-addr-chk'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='lfence-always-serializing'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='rdctl-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='mds-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='pschange-mc-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='gds-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='require' name='rfds-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <feature policy='disable' name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </mode>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <mode name='custom' supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Broadwell'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Broadwell-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Broadwell-noTSX'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Broadwell-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Broadwell-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Broadwell-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Broadwell-v4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cascadelake-Server'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cascadelake-Server-noTSX'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cascadelake-Server-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cascadelake-Server-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cascadelake-Server-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cascadelake-Server-v4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cascadelake-Server-v5'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cooperlake'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cooperlake-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Cooperlake-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Denverton'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mpx'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Denverton-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mpx'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Denverton-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Denverton-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Dhyana-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Genoa'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amd-psfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='auto-ibrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='no-nested-data-bp'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='null-sel-clr-base'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='stibp-always-on'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Genoa-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amd-psfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='auto-ibrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='no-nested-data-bp'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='null-sel-clr-base'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='stibp-always-on'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Milan'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Milan-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Milan-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amd-psfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='no-nested-data-bp'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='null-sel-clr-base'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='stibp-always-on'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Rome'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Rome-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Rome-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-Rome-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='EPYC-v4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='GraniteRapids'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-tile'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fbsdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrc'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fzrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mcdt-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pbrsb-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='prefetchiti'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='psdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='GraniteRapids-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-tile'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fbsdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrc'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fzrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mcdt-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pbrsb-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='prefetchiti'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='psdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='GraniteRapids-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-tile'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx10'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx10-128'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx10-256'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx10-512'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cldemote'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fbsdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrc'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fzrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mcdt-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdir64b'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdiri'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pbrsb-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='prefetchiti'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='psdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Haswell'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Haswell-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Haswell-noTSX'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Haswell-noTSX-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Haswell-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Haswell-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Haswell-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Haswell-v4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server-noTSX'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server-v4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server-v5'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server-v6'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Icelake-Server-v7'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='IvyBridge'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='IvyBridge-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='IvyBridge-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='IvyBridge-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='KnightsMill'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-4fmaps'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-4vnniw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512er'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512pf'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='KnightsMill-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-4fmaps'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-4vnniw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512er'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512pf'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Opteron_G4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fma4'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xop'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Opteron_G4-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fma4'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xop'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Opteron_G5'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fma4'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tbm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xop'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Opteron_G5-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fma4'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tbm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xop'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='SapphireRapids'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-tile'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrc'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fzrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='SapphireRapids-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-tile'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrc'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fzrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='SapphireRapids-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-tile'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fbsdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrc'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fzrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='psdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='SapphireRapids-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='amx-tile'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-bf16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-fp16'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512-vpopcntdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bitalg'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vbmi2'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cldemote'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fbsdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrc'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fzrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='la57'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdir64b'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdiri'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='psdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='taa-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='tsx-ldtrk'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xfd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='SierraForest'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-ne-convert'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cmpccxadd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fbsdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mcdt-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pbrsb-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='psdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='SierraForest-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-ifma'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-ne-convert'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx-vnni-int8'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='bus-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cmpccxadd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fbsdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='fsrs'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ibrs-all'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mcdt-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pbrsb-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='psdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='sbdr-ssdp-no'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='serialize'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vaes'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='vpclmulqdq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Client'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Client-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Client-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Client-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Client-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Client-v4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Server'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Server-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Server-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Server-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='hle'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='rtm'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Server-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Server-v4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Skylake-Server-v5'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512bw'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512cd'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512dq'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512f'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='avx512vl'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='invpcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pcid'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='pku'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Snowridge'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cldemote'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='core-capability'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdir64b'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdiri'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mpx'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='split-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Snowridge-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cldemote'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='core-capability'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdir64b'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdiri'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='mpx'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='split-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Snowridge-v2'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cldemote'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='core-capability'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdir64b'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdiri'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='split-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Snowridge-v3'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cldemote'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='core-capability'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdir64b'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdiri'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='split-lock-detect'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='Snowridge-v4'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='cldemote'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='erms'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='gfni'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdir64b'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='movdiri'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='xsaves'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='athlon'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='3dnow'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='3dnowext'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='athlon-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='3dnow'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='3dnowext'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='core2duo'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='core2duo-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='coreduo'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='coreduo-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='n270'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='n270-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='ss'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='phenom'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='3dnow'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='3dnowext'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <blockers model='phenom-v1'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='3dnow'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <feature name='3dnowext'/>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </blockers>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </mode>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  </cpu>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <memoryBacking supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <enum name='sourceType'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <value>file</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <value>anonymous</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <value>memfd</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  </memoryBacking>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:  <devices>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:    <disk supported='yes'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='diskDevice'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>disk</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>cdrom</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>floppy</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>lun</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      </enum>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:      <enum name='bus'>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>fdc</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>scsi</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>virtio</value>
Oct 10 06:05:46 np0005479823 nova_compute[235775]:        <value>usb</value>
Oct 10 06:07:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:06.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:06 np0005479823 rsyslogd[1001]: imjournal: 2505 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Oct 10 06:07:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c004350 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa848002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868004250 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:06 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 06:07:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:07.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100707 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 06:07:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:08.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:08 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:08 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c004370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:08 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:09.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:10.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:07:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:10 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868004250 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:10 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:10 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa85c004370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:11.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:12.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:12 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100712 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 06:07:12 np0005479823 podman[236687]: 2025-10-10 10:07:12.804142558 +0000 UTC m=+0.078391422 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 10 06:07:12 np0005479823 podman[236690]: 2025-10-10 10:07:12.836758159 +0000 UTC m=+0.095729625 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 10 06:07:12 np0005479823 podman[236688]: 2025-10-10 10:07:12.836769329 +0000 UTC m=+0.110507906 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 06:07:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:12 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:12 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868004250 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:13.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:14.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:14 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa868004250 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:14 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa83c003c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:14 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:07:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:15.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:16.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:16 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:07:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:16 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:16 np0005479823 kernel: ganesha.nfsd[236682]: segfault at 50 ip 00007fa91a7fc32e sp 00007fa8cfffe210 error 4 in libntirpc.so.5.8[7fa91a7e1000+2c000] likely on CPU 1 (core 0, socket 1)
Oct 10 06:07:16 np0005479823 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Oct 10 06:07:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[221545]: 10/10/2025 10:07:16 : epoch 68e8da16 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa838001bd0 fd 48 proxy ignored for local
Oct 10 06:07:16 np0005479823 systemd[1]: Started Process Core Dump (PID 236779/UID 0).
Oct 10 06:07:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:17.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:18 np0005479823 systemd-coredump[236780]: Process 221552 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 63:#012#0  0x00007fa91a7fc32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Oct 10 06:07:18 np0005479823 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 06:07:18 np0005479823 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 06:07:18 np0005479823 systemd[1]: systemd-coredump@7-236779-0.service: Deactivated successfully.
Oct 10 06:07:18 np0005479823 systemd[1]: systemd-coredump@7-236779-0.service: Consumed 1.162s CPU time.
Oct 10 06:07:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:18.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:18 np0005479823 podman[236786]: 2025-10-10 10:07:18.192515476 +0000 UTC m=+0.025101691 container died 4d2a7c9961c6ba6f3729e097fc77944ca60dcabd2592c9e458b8850033b98ec1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 06:07:18 np0005479823 systemd[1]: var-lib-containers-storage-overlay-ed46e148b9f8eeed63d17e29d76b4f2a4e379a6bb6a82713a88a44ea921fa1d2-merged.mount: Deactivated successfully.
Oct 10 06:07:18 np0005479823 podman[236786]: 2025-10-10 10:07:18.229592448 +0000 UTC m=+0.062178643 container remove 4d2a7c9961c6ba6f3729e097fc77944ca60dcabd2592c9e458b8850033b98ec1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 06:07:18 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Main process exited, code=exited, status=139/n/a
Oct 10 06:07:18 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Failed with result 'exit-code'.
Oct 10 06:07:18 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.663s CPU time.
Oct 10 06:07:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:19.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:20.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:07:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:07:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:21.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:07:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:22.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100722 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 06:07:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:23.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:24.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Oct 10 06:07:25 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1743770823' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Oct 10 06:07:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:07:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:25.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:26.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 10 06:07:26 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1938819953' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 06:07:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 10 06:07:26 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1938819953' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 06:07:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:26 np0005479823 podman[236836]: 2025-10-10 10:07:26.801937721 +0000 UTC m=+0.076375389 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct 10 06:07:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:07:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:27.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:07:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100727 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 06:07:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:28.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:28 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Scheduled restart job, restart counter is at 8.
Oct 10 06:07:28 np0005479823 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 06:07:28 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.663s CPU time.
Oct 10 06:07:28 np0005479823 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 06:07:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:28 np0005479823 podman[236908]: 2025-10-10 10:07:28.679181229 +0000 UTC m=+0.050916087 container create 90bb825f5e1d14c7a708252a75b1b4311bb0c31961179dd5584f706db20938c5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 06:07:28 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d83eab439fa5a3e9abff5a44dbe3ed5529a7fd8a5d250f9e424122df311085d4/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 06:07:28 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d83eab439fa5a3e9abff5a44dbe3ed5529a7fd8a5d250f9e424122df311085d4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 06:07:28 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d83eab439fa5a3e9abff5a44dbe3ed5529a7fd8a5d250f9e424122df311085d4/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 06:07:28 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d83eab439fa5a3e9abff5a44dbe3ed5529a7fd8a5d250f9e424122df311085d4/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 06:07:28 np0005479823 podman[236908]: 2025-10-10 10:07:28.659266443 +0000 UTC m=+0.031001321 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 06:07:28 np0005479823 podman[236908]: 2025-10-10 10:07:28.756231898 +0000 UTC m=+0.127966786 container init 90bb825f5e1d14c7a708252a75b1b4311bb0c31961179dd5584f706db20938c5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, ceph=True)
Oct 10 06:07:28 np0005479823 podman[236908]: 2025-10-10 10:07:28.760763254 +0000 UTC m=+0.132498112 container start 90bb825f5e1d14c7a708252a75b1b4311bb0c31961179dd5584f706db20938c5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Oct 10 06:07:28 np0005479823 bash[236908]: 90bb825f5e1d14c7a708252a75b1b4311bb0c31961179dd5584f706db20938c5
Oct 10 06:07:28 np0005479823 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 06:07:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 06:07:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 06:07:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 06:07:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 06:07:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 06:07:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 06:07:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 06:07:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:07:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:07:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:29.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:07:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:30.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:07:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:07:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:31.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:07:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:32.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:33.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100733 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 06:07:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:34.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:34 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:07:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:34 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:07:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:34 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:07:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:07:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:35.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:36.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:07:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:37.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:07:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:38.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:38 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:07:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:38 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:07:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:38 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:07:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:39 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:07:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:07:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:39.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:07:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:39 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:07:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:39 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:07:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:39 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:07:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:40.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:07:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:07:41.460 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:07:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:07:41.460 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:07:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:07:41.460 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:07:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:41.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:42.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:43.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:43 np0005479823 podman[237008]: 2025-10-10 10:07:43.793675514 +0000 UTC m=+0.058886481 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 10 06:07:43 np0005479823 podman[237007]: 2025-10-10 10:07:43.802108943 +0000 UTC m=+0.077443993 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 06:07:43 np0005479823 podman[237006]: 2025-10-10 10:07:43.817504344 +0000 UTC m=+0.094034123 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 06:07:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:44.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:07:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:45.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:46.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:46 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d90000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:46 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:46 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:47 np0005479823 nova_compute[235775]: 2025-10-10 10:07:47.040 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:07:47 np0005479823 nova_compute[235775]: 2025-10-10 10:07:47.040 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:07:47 np0005479823 nova_compute[235775]: 2025-10-10 10:07:47.061 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:07:47 np0005479823 nova_compute[235775]: 2025-10-10 10:07:47.061 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 10 06:07:47 np0005479823 nova_compute[235775]: 2025-10-10 10:07:47.061 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 10 06:07:47 np0005479823 nova_compute[235775]: 2025-10-10 10:07:47.074 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct 10 06:07:47 np0005479823 nova_compute[235775]: 2025-10-10 10:07:47.074 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:07:47 np0005479823 nova_compute[235775]: 2025-10-10 10:07:47.074 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:07:47 np0005479823 nova_compute[235775]: 2025-10-10 10:07:47.074 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:07:47 np0005479823 nova_compute[235775]: 2025-10-10 10:07:47.074 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:07:47 np0005479823 nova_compute[235775]: 2025-10-10 10:07:47.075 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct 10 06:07:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:47.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:47 np0005479823 nova_compute[235775]: 2025-10-10 10:07:47.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:07:47 np0005479823 nova_compute[235775]: 2025-10-10 10:07:47.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:07:47 np0005479823 nova_compute[235775]: 2025-10-10 10:07:47.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:07:47 np0005479823 nova_compute[235775]: 2025-10-10 10:07:47.834 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:07:47 np0005479823 nova_compute[235775]: 2025-10-10 10:07:47.834 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:07:47 np0005479823 nova_compute[235775]: 2025-10-10 10:07:47.835 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:07:47 np0005479823 nova_compute[235775]: 2025-10-10 10:07:47.835 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 10 06:07:47 np0005479823 nova_compute[235775]: 2025-10-10 10:07:47.835 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:07:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:48.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:48 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:07:48 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/85312783' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:07:48 np0005479823 nova_compute[235775]: 2025-10-10 10:07:48.288 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:07:48 np0005479823 nova_compute[235775]: 2025-10-10 10:07:48.434 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:07:48 np0005479823 nova_compute[235775]: 2025-10-10 10:07:48.435 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5259MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 10 06:07:48 np0005479823 nova_compute[235775]: 2025-10-10 10:07:48.435 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:07:48 np0005479823 nova_compute[235775]: 2025-10-10 10:07:48.436 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:07:48 np0005479823 nova_compute[235775]: 2025-10-10 10:07:48.491 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 10 06:07:48 np0005479823 nova_compute[235775]: 2025-10-10 10:07:48.491 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 10 06:07:48 np0005479823 nova_compute[235775]: 2025-10-10 10:07:48.507 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:07:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:48 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:07:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:48 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:07:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:48 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100748 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 06:07:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:48 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:48 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:07:48 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4252308067' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:07:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:48 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:48 np0005479823 nova_compute[235775]: 2025-10-10 10:07:48.951 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:07:48 np0005479823 nova_compute[235775]: 2025-10-10 10:07:48.958 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 10 06:07:48 np0005479823 nova_compute[235775]: 2025-10-10 10:07:48.975 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 10 06:07:48 np0005479823 nova_compute[235775]: 2025-10-10 10:07:48.978 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct 10 06:07:48 np0005479823 nova_compute[235775]: 2025-10-10 10:07:48.978 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.543s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:07:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:49.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:07:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:50.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:07:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:07:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:50 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:50 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d680016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:50 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:51.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:51 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 06:07:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:52.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:52 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:52 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:52 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d680016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:53 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:07:53 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:07:53 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:07:53 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:07:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:07:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:53.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:07:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100753 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 06:07:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:07:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:54.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:07:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:54 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:54 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:54 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:07:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:07:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:55.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:07:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:56.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:56 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d680016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:56 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:56 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:07:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:57.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.548366) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090877548450, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 2368, "num_deletes": 251, "total_data_size": 6249742, "memory_usage": 6358064, "flush_reason": "Manual Compaction"}
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090877575466, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 4069084, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20848, "largest_seqno": 23210, "table_properties": {"data_size": 4059605, "index_size": 5973, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19602, "raw_average_key_size": 20, "raw_value_size": 4040653, "raw_average_value_size": 4165, "num_data_blocks": 262, "num_entries": 970, "num_filter_entries": 970, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760090666, "oldest_key_time": 1760090666, "file_creation_time": 1760090877, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 27163 microseconds, and 15864 cpu microseconds.
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.575541) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 4069084 bytes OK
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.575571) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.577272) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.577296) EVENT_LOG_v1 {"time_micros": 1760090877577288, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.577328) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 6239323, prev total WAL file size 6275848, number of live WAL files 2.
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.580022) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(3973KB)], [39(12MB)]
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090877580077, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 16934462, "oldest_snapshot_seqno": -1}
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 5420 keys, 14721321 bytes, temperature: kUnknown
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090877664661, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 14721321, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14683056, "index_size": 23627, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13573, "raw_key_size": 136693, "raw_average_key_size": 25, "raw_value_size": 14582880, "raw_average_value_size": 2690, "num_data_blocks": 976, "num_entries": 5420, "num_filter_entries": 5420, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760090877, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.664995) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 14721321 bytes
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.666117) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 200.0 rd, 173.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.9, 12.3 +0.0 blob) out(14.0 +0.0 blob), read-write-amplify(7.8) write-amplify(3.6) OK, records in: 5940, records dropped: 520 output_compression: NoCompression
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.666133) EVENT_LOG_v1 {"time_micros": 1760090877666125, "job": 22, "event": "compaction_finished", "compaction_time_micros": 84669, "compaction_time_cpu_micros": 49161, "output_level": 6, "num_output_files": 1, "total_output_size": 14721321, "num_input_records": 5940, "num_output_records": 5420, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090877667002, "job": 22, "event": "table_file_deletion", "file_number": 41}
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090877669172, "job": 22, "event": "table_file_deletion", "file_number": 39}
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.579958) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.669259) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.669264) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.669265) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.669266) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:07:57 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:07:57.669268) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:07:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:57 np0005479823 podman[237251]: 2025-10-10 10:07:57.820737277 +0000 UTC m=+0.078459906 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 10 06:07:58 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:07:58 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:07:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:07:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:07:58.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:07:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:58 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:58 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:07:58 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:07:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:07:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:07:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:07:59.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:07:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:07:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:07:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:07:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:00.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:08:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:00 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:00 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:00 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:01.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:02.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:02 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:02 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74003b20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:02 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:03.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:04.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:04 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:04 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:04 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74003b20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:08:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:05.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:06.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:06 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:06 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:06 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:07.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:08.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:08 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:08 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:08 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:09.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:10.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:08:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:10 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:10 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:10 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c001c40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:08:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:11.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:08:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:12.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:12 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:12 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:12 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:08:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:13.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:08:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:14.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:14 np0005479823 podman[237314]: 2025-10-10 10:08:14.7789432 +0000 UTC m=+0.054160611 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=iscsid)
Oct 10 06:08:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:14 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c002740 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:14 np0005479823 podman[237312]: 2025-10-10 10:08:14.793650109 +0000 UTC m=+0.067355241 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Oct 10 06:08:14 np0005479823 podman[237313]: 2025-10-10 10:08:14.824657679 +0000 UTC m=+0.102793442 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 10 06:08:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:14 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c002740 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:14 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c002740 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:08:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:15.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:16.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:16 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c002740 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:16 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c002740 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:16 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:17.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:08:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:18.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:08:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:18 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:18 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:18 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:19.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:19 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:08:19.604 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct 10 06:08:19 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:08:19.606 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct 10 06:08:19 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:08:19.607 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:08:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:08:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:20.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:20 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:20 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:20 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:08:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:21.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:08:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:08:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:22.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:08:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:22 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d600016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:22 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:22 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:23.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:24.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:24 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:24 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d600016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:24 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:08:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:08:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:25.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:08:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:26.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:26 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:26 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:26 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d600016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:27.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:28.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:28 np0005479823 podman[237416]: 2025-10-10 10:08:28.779634901 +0000 UTC m=+0.052358072 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 10 06:08:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:29.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:08:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:08:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:30.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:08:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:30 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:30 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:30 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:31.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:32.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:32 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:32 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:32 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c004140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:33.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:34.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:34 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:34 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:34 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:08:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:35.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:36.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:36 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c004160 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:36 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:36 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:37.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:38.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:38 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:38 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c004180 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:39 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:08:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:39.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:08:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:08:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:08:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:40.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:08:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:40 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:40 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:41 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c0041a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:08:41.461 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:08:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:08:41.461 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:08:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:08:41.461 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:08:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:41.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:42.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:42 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:42 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:43 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:08:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:43.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:08:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:44.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:44 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c0041c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:44 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:08:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:45.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:45 np0005479823 podman[237480]: 2025-10-10 10:08:45.779291246 +0000 UTC m=+0.054247044 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001)
Oct 10 06:08:45 np0005479823 podman[237478]: 2025-10-10 10:08:45.784577834 +0000 UTC m=+0.055679658 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 06:08:45 np0005479823 podman[237479]: 2025-10-10 10:08:45.833649411 +0000 UTC m=+0.111134549 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 10 06:08:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:46.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:46 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:46 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c0041e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:46 np0005479823 nova_compute[235775]: 2025-10-10 10:08:46.974 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:08:46 np0005479823 nova_compute[235775]: 2025-10-10 10:08:46.974 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:08:46 np0005479823 nova_compute[235775]: 2025-10-10 10:08:46.974 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 10 06:08:46 np0005479823 nova_compute[235775]: 2025-10-10 10:08:46.975 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 10 06:08:46 np0005479823 nova_compute[235775]: 2025-10-10 10:08:46.989 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct 10 06:08:46 np0005479823 nova_compute[235775]: 2025-10-10 10:08:46.989 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:08:46 np0005479823 nova_compute[235775]: 2025-10-10 10:08:46.989 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:08:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:47 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:47.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:47 np0005479823 nova_compute[235775]: 2025-10-10 10:08:47.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:08:47 np0005479823 nova_compute[235775]: 2025-10-10 10:08:47.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:08:47 np0005479823 nova_compute[235775]: 2025-10-10 10:08:47.836 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:08:47 np0005479823 nova_compute[235775]: 2025-10-10 10:08:47.837 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:08:47 np0005479823 nova_compute[235775]: 2025-10-10 10:08:47.837 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:08:47 np0005479823 nova_compute[235775]: 2025-10-10 10:08:47.837 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 10 06:08:47 np0005479823 nova_compute[235775]: 2025-10-10 10:08:47.837 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:08:48 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:08:48 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1968454522' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:08:48 np0005479823 nova_compute[235775]: 2025-10-10 10:08:48.247 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:08:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:48.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:48 np0005479823 nova_compute[235775]: 2025-10-10 10:08:48.445 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:08:48 np0005479823 nova_compute[235775]: 2025-10-10 10:08:48.446 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5231MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 10 06:08:48 np0005479823 nova_compute[235775]: 2025-10-10 10:08:48.446 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:08:48 np0005479823 nova_compute[235775]: 2025-10-10 10:08:48.446 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:08:48 np0005479823 nova_compute[235775]: 2025-10-10 10:08:48.502 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 10 06:08:48 np0005479823 nova_compute[235775]: 2025-10-10 10:08:48.503 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 10 06:08:48 np0005479823 nova_compute[235775]: 2025-10-10 10:08:48.521 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:08:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:48 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:48 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:48 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:08:48 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4155216318' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:08:48 np0005479823 nova_compute[235775]: 2025-10-10 10:08:48.962 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:08:48 np0005479823 nova_compute[235775]: 2025-10-10 10:08:48.968 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 10 06:08:48 np0005479823 nova_compute[235775]: 2025-10-10 10:08:48.990 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 10 06:08:48 np0005479823 nova_compute[235775]: 2025-10-10 10:08:48.993 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct 10 06:08:48 np0005479823 nova_compute[235775]: 2025-10-10 10:08:48.993 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.547s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:08:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:49 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:49.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:49 np0005479823 nova_compute[235775]: 2025-10-10 10:08:49.994 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:08:49 np0005479823 nova_compute[235775]: 2025-10-10 10:08:49.994 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:08:49 np0005479823 nova_compute[235775]: 2025-10-10 10:08:49.995 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:08:49 np0005479823 nova_compute[235775]: 2025-10-10 10:08:49.995 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct 10 06:08:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:08:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:50.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:50 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:50 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:51 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:51.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:52.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:52 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0020f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:52 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:53 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:53.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:54.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:54 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:54 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0020f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:55 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:08:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:55.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:56.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:56 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:56 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:57 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0020f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:57.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100857 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 06:08:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:08:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:08:58.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:08:58 np0005479823 podman[237744]: 2025-10-10 10:08:58.596564214 +0000 UTC m=+0.104186328 container exec bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Oct 10 06:08:58 np0005479823 podman[237744]: 2025-10-10 10:08:58.693384345 +0000 UTC m=+0.201006439 container exec_died bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325)
Oct 10 06:08:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:58 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0020f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:58 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Oct 10 06:08:58 np0005479823 podman[237798]: 2025-10-10 10:08:58.920220628 +0000 UTC m=+0.080977707 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001)
Oct 10 06:08:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:58 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0020f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:08:59 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:08:59 np0005479823 podman[237882]: 2025-10-10 10:08:59.170947872 +0000 UTC m=+0.053358204 container exec 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 06:08:59 np0005479823 podman[237882]: 2025-10-10 10:08:59.176058197 +0000 UTC m=+0.058468529 container exec_died 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 06:08:59 np0005479823 podman[237974]: 2025-10-10 10:08:59.52447392 +0000 UTC m=+0.064776469 container exec 90bb825f5e1d14c7a708252a75b1b4311bb0c31961179dd5584f706db20938c5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1)
Oct 10 06:08:59 np0005479823 podman[237974]: 2025-10-10 10:08:59.537171266 +0000 UTC m=+0.077473715 container exec_died 90bb825f5e1d14c7a708252a75b1b4311bb0c31961179dd5584f706db20938c5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 10 06:08:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:08:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:08:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:08:59.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:08:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:08:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:08:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:08:59 np0005479823 podman[238036]: 2025-10-10 10:08:59.801375691 +0000 UTC m=+0.061511505 container exec 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 06:08:59 np0005479823 podman[238036]: 2025-10-10 10:08:59.842265698 +0000 UTC m=+0.102401482 container exec_died 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 06:09:00 np0005479823 podman[238102]: 2025-10-10 10:09:00.105280585 +0000 UTC m=+0.076814144 container exec 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, vcs-type=git, release=1793, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, vendor=Red Hat, Inc., version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Oct 10 06:09:00 np0005479823 podman[238102]: 2025-10-10 10:09:00.160071014 +0000 UTC m=+0.131604563 container exec_died 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, com.redhat.component=keepalived-container, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, distribution-scope=public, architecture=x86_64, release=1793, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.expose-services=, vcs-type=git)
Oct 10 06:09:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:09:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:00.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:00 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0020f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:00 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0020f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:01 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0020f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:01 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:09:01 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:09:01 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:09:01 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:09:01 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Oct 10 06:09:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:01.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:02.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:02 np0005479823 systemd[1]: packagekit.service: Deactivated successfully.
Oct 10 06:09:02 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Oct 10 06:09:02 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:09:02 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:09:02 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:09:02 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:09:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:02 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0020f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:02 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:03 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:03.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:04.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:04 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:04 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0020f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:05 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:09:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:05.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:06.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:06 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:09:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:06 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:06 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:07 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c004470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:07.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:07 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:09:07 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:09:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:08.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:08 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:08 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:09 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:09.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:09 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:09:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:09 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:09:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:09 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:09:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:09:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:09:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:10.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:09:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:10 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c004470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:10 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:11 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:11.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:12.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:12 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 06:09:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:12 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:12 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c004470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:13 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003c90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:13.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:14.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:14 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:14 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:15 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:09:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:15.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:16.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:16 np0005479823 podman[238317]: 2025-10-10 10:09:16.80155126 +0000 UTC m=+0.075608576 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 06:09:16 np0005479823 podman[238320]: 2025-10-10 10:09:16.816782345 +0000 UTC m=+0.075684517 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct 10 06:09:16 np0005479823 podman[238319]: 2025-10-10 10:09:16.817390825 +0000 UTC m=+0.090538541 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 10 06:09:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:16 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:16 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:17 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:09:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:17.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:09:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100917 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 06:09:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:18.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:18 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:18 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:19 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003cd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:19.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:09:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:20.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:20 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:20 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:21 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:21.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:09:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:22.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:09:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:22 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003d80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:22 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:23 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74002420 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:23.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:24.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:24 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:24 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003d80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:25 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:09:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:25.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:09:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:26.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:09:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100926 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 06:09:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:26 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74002420 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:26 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:27 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003d80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:09:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:27.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:09:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:28.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74002420 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:29 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:09:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:29.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:09:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:29 np0005479823 podman[238396]: 2025-10-10 10:09:29.773472494 +0000 UTC m=+0.052986003 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 10 06:09:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:09:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:30.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:30 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003d80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:30 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:31 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d74002420 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:09:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:31.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:09:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:32.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:32 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:32 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003d80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:33 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:33.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:34.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:34 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:34 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:35 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003da0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:35 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:09:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:09:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:35.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:36.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:36 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:36 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:37 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:37.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:38 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:09:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:38 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:09:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:09:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:38.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:09:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:38 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:38 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:39 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:09:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:39.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:09:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:09:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:40.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:40 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:40 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:41 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003de0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:41 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 06:09:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:09:41.461 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:09:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:09:41.462 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:09:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:09:41.462 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:09:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:41.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:09:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:42.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:09:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:42 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:42 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:43 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:43.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.186656) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090984186690, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 1302, "num_deletes": 250, "total_data_size": 3196741, "memory_usage": 3260136, "flush_reason": "Manual Compaction"}
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090984197237, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 1331552, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23215, "largest_seqno": 24512, "table_properties": {"data_size": 1327081, "index_size": 1995, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 11513, "raw_average_key_size": 20, "raw_value_size": 1317447, "raw_average_value_size": 2340, "num_data_blocks": 86, "num_entries": 563, "num_filter_entries": 563, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760090877, "oldest_key_time": 1760090877, "file_creation_time": 1760090984, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 10610 microseconds, and 3768 cpu microseconds.
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.197267) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 1331552 bytes OK
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.197281) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.200350) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.200366) EVENT_LOG_v1 {"time_micros": 1760090984200360, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.200381) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 3190593, prev total WAL file size 3190593, number of live WAL files 2.
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.201282) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353033' seq:72057594037927935, type:22 .. '6D67727374617400373534' seq:0, type:0; will stop at (end)
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(1300KB)], [42(14MB)]
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090984201348, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 16052873, "oldest_snapshot_seqno": -1}
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 5512 keys, 12707545 bytes, temperature: kUnknown
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090984276144, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 12707545, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12671786, "index_size": 20865, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13829, "raw_key_size": 138905, "raw_average_key_size": 25, "raw_value_size": 12573186, "raw_average_value_size": 2281, "num_data_blocks": 855, "num_entries": 5512, "num_filter_entries": 5512, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760090984, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.276371) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 12707545 bytes
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.278138) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 214.4 rd, 169.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 14.0 +0.0 blob) out(12.1 +0.0 blob), read-write-amplify(21.6) write-amplify(9.5) OK, records in: 5983, records dropped: 471 output_compression: NoCompression
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.278155) EVENT_LOG_v1 {"time_micros": 1760090984278147, "job": 24, "event": "compaction_finished", "compaction_time_micros": 74858, "compaction_time_cpu_micros": 34977, "output_level": 6, "num_output_files": 1, "total_output_size": 12707545, "num_input_records": 5983, "num_output_records": 5512, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090984278481, "job": 24, "event": "table_file_deletion", "file_number": 44}
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760090984281292, "job": 24, "event": "table_file_deletion", "file_number": 42}
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.201193) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.281326) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.281331) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.281333) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.281335) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:09:44 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:09:44.281337) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:09:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:09:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:44.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:09:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:44 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003e00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:44 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003c90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:45 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:09:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000065s ======
Oct 10 06:09:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:45.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Oct 10 06:09:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:45 np0005479823 nova_compute[235775]: 2025-10-10 10:09:45.817 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:09:45 np0005479823 nova_compute[235775]: 2025-10-10 10:09:45.817 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 10 06:09:45 np0005479823 nova_compute[235775]: 2025-10-10 10:09:45.818 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 10 06:09:45 np0005479823 nova_compute[235775]: 2025-10-10 10:09:45.837 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct 10 06:09:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:46.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=404 latency=0.001000032s ======
Oct 10 06:09:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:46.578 +0000] "GET /healthcheck HTTP/1.1" 404 242 - "python-urllib3/1.26.5" - latency=0.001000032s
Oct 10 06:09:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:46 np0005479823 nova_compute[235775]: 2025-10-10 10:09:46.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:09:46 np0005479823 nova_compute[235775]: 2025-10-10 10:09:46.834 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:09:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100946 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 06:09:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:46 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:46 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003e20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:47 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:09:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:47.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:09:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:47 np0005479823 podman[238458]: 2025-10-10 10:09:47.783503717 +0000 UTC m=+0.055543134 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3)
Oct 10 06:09:47 np0005479823 podman[238460]: 2025-10-10 10:09:47.793894849 +0000 UTC m=+0.056537606 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct 10 06:09:47 np0005479823 nova_compute[235775]: 2025-10-10 10:09:47.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:09:47 np0005479823 nova_compute[235775]: 2025-10-10 10:09:47.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:09:47 np0005479823 nova_compute[235775]: 2025-10-10 10:09:47.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:09:47 np0005479823 podman[238459]: 2025-10-10 10:09:47.86689843 +0000 UTC m=+0.125116566 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, container_name=ovn_controller)
Oct 10 06:09:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:48.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:48 np0005479823 nova_compute[235775]: 2025-10-10 10:09:48.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:09:48 np0005479823 nova_compute[235775]: 2025-10-10 10:09:48.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct 10 06:09:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:48 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60003cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:48 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:49 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:09:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:49.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:09:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:49 np0005479823 nova_compute[235775]: 2025-10-10 10:09:49.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:09:49 np0005479823 nova_compute[235775]: 2025-10-10 10:09:49.844 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:09:49 np0005479823 nova_compute[235775]: 2025-10-10 10:09:49.845 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:09:49 np0005479823 nova_compute[235775]: 2025-10-10 10:09:49.845 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:09:49 np0005479823 nova_compute[235775]: 2025-10-10 10:09:49.845 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 10 06:09:49 np0005479823 nova_compute[235775]: 2025-10-10 10:09:49.845 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:09:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:09:50 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2285751282' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:09:50 np0005479823 nova_compute[235775]: 2025-10-10 10:09:50.264 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:09:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:09:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:50.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:50 np0005479823 nova_compute[235775]: 2025-10-10 10:09:50.427 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:09:50 np0005479823 nova_compute[235775]: 2025-10-10 10:09:50.428 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5238MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 10 06:09:50 np0005479823 nova_compute[235775]: 2025-10-10 10:09:50.429 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:09:50 np0005479823 nova_compute[235775]: 2025-10-10 10:09:50.429 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:09:50 np0005479823 nova_compute[235775]: 2025-10-10 10:09:50.491 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 10 06:09:50 np0005479823 nova_compute[235775]: 2025-10-10 10:09:50.491 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 10 06:09:50 np0005479823 nova_compute[235775]: 2025-10-10 10:09:50.513 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:09:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:50 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:09:50 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2966347030' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:09:50 np0005479823 nova_compute[235775]: 2025-10-10 10:09:50.955 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:09:50 np0005479823 nova_compute[235775]: 2025-10-10 10:09:50.963 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 10 06:09:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:50 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:50 np0005479823 nova_compute[235775]: 2025-10-10 10:09:50.992 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 10 06:09:50 np0005479823 nova_compute[235775]: 2025-10-10 10:09:50.995 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct 10 06:09:50 np0005479823 nova_compute[235775]: 2025-10-10 10:09:50.996 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.567s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:09:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:51 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:51 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e144 e144: 3 total, 3 up, 3 in
Oct 10 06:09:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:51.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:52 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e145 e145: 3 total, 3 up, 3 in
Oct 10 06:09:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:52.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:52 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84001080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:52 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:52 np0005479823 nova_compute[235775]: 2025-10-10 10:09:52.996 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:09:52 np0005479823 nova_compute[235775]: 2025-10-10 10:09:52.996 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:09:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:53 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:53 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e146 e146: 3 total, 3 up, 3 in
Oct 10 06:09:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:09:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:53.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:09:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:54 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e147 e147: 3 total, 3 up, 3 in
Oct 10 06:09:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:09:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:54.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:09:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:54 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003e40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:54 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84001080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:55 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:09:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e148 e148: 3 total, 3 up, 3 in
Oct 10 06:09:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:09:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:55.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:09:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:56.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:56 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/100956 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 06:09:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:56 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003e60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:57 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002900 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:09:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:57.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:09:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:09:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:09:58.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:09:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:58 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:58 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:09:59 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:09:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:09:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:09:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:09:59.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:09:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:09:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:09:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:09:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:10:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:00.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:00 np0005479823 podman[238602]: 2025-10-10 10:10:00.797789409 +0000 UTC m=+0.066190333 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 10 06:10:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:00 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84002900 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:00 np0005479823 ceph-mon[74913]: overall HEALTH_OK
Oct 10 06:10:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:00 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:01 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:01.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:02.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:02 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:02 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84003610 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:03 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84003610 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:03.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:04.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:04 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:04 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:05 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:05 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:10:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:10:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:05.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:06.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:06 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84003610 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:06 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:07 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003f20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:07.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:08 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:10:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:08 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:10:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:10:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:08.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:10:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:08 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:08 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84004710 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:09 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:09.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:10:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:10.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:10 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0045f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:10 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:11 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84004710 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:11 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 06:10:11 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:10:11 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:10:11 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:10:11 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:10:11 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:10:11 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:10:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:11.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:10:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:12.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:10:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:12 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84004710 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:12 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:13 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:10:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:13.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:10:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:14.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:14 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:10:14.636 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct 10 06:10:14 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:10:14.637 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct 10 06:10:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:14 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84004710 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:14 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c004790 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:15 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:10:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:15.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:16.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/101016 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 06:10:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:16 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:16 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84004710 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:17 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0047b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:17 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:10:17 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:10:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:17.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:18.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:18 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:10:18.640 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:10:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:18 np0005479823 podman[238773]: 2025-10-10 10:10:18.824127752 +0000 UTC m=+0.091478243 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 10 06:10:18 np0005479823 podman[238774]: 2025-10-10 10:10:18.828031536 +0000 UTC m=+0.089663514 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 10 06:10:18 np0005479823 podman[238780]: 2025-10-10 10:10:18.84570453 +0000 UTC m=+0.099287931 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 10 06:10:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:18 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:18 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:19 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84004710 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.204600) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091019204635, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 713, "num_deletes": 257, "total_data_size": 1346583, "memory_usage": 1367056, "flush_reason": "Manual Compaction"}
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091019214039, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 870498, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24517, "largest_seqno": 25225, "table_properties": {"data_size": 867026, "index_size": 1316, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 7925, "raw_average_key_size": 18, "raw_value_size": 859782, "raw_average_value_size": 2008, "num_data_blocks": 58, "num_entries": 428, "num_filter_entries": 428, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760090985, "oldest_key_time": 1760090985, "file_creation_time": 1760091019, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 9496 microseconds, and 5328 cpu microseconds.
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.214091) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 870498 bytes OK
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.214115) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.215920) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.215941) EVENT_LOG_v1 {"time_micros": 1760091019215934, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.215962) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 1342679, prev total WAL file size 1342679, number of live WAL files 2.
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.216857) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323530' seq:72057594037927935, type:22 .. '6C6F676D00353033' seq:0, type:0; will stop at (end)
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(850KB)], [45(12MB)]
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091019216906, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 13578043, "oldest_snapshot_seqno": -1}
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 5409 keys, 13422995 bytes, temperature: kUnknown
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091019299560, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 13422995, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13386831, "index_size": 21526, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13573, "raw_key_size": 137997, "raw_average_key_size": 25, "raw_value_size": 13288926, "raw_average_value_size": 2456, "num_data_blocks": 879, "num_entries": 5409, "num_filter_entries": 5409, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760091019, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.299867) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 13422995 bytes
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.301079) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 164.1 rd, 162.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 12.1 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(31.0) write-amplify(15.4) OK, records in: 5940, records dropped: 531 output_compression: NoCompression
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.301099) EVENT_LOG_v1 {"time_micros": 1760091019301090, "job": 26, "event": "compaction_finished", "compaction_time_micros": 82728, "compaction_time_cpu_micros": 45431, "output_level": 6, "num_output_files": 1, "total_output_size": 13422995, "num_input_records": 5940, "num_output_records": 5409, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091019301362, "job": 26, "event": "table_file_deletion", "file_number": 47}
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091019304190, "job": 26, "event": "table_file_deletion", "file_number": 45}
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.216770) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.304232) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.304239) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.304242) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.304245) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:10:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:10:19.304248) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:10:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:19.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:10:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:20.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:20 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0047d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:20 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:21 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d68003fe0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:10:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:21.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:10:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:22.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:22 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d84004710 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:22 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0047f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:23 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:10:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:23.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:10:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:24.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:24 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d58000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:24 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:25 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d8c0047f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:10:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:25.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 10 06:10:26 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3994215839' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 06:10:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 10 06:10:26 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3994215839' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 06:10:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:26.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:26 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:26 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d580016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:27 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:27.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:28.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d6c001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:28 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d60001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:29 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d580016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:10:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:29.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:10:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:10:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:10:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:30.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:10:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:30 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d580016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:30 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d580016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:31 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d580016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:31.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:31 np0005479823 podman[238855]: 2025-10-10 10:10:31.79702922 +0000 UTC m=+0.065949087 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct 10 06:10:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:32.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[236924]: 10/10/2025 10:10:32 : epoch 68e8dae0 : compute-2 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9d740032b0 fd 39 proxy ignored for local
Oct 10 06:10:32 np0005479823 kernel: ganesha.nfsd[238389]: segfault at 50 ip 00007f9e3fab232e sp 00007f9e0cff8210 error 4 in libntirpc.so.5.8[7f9e3fa97000+2c000] likely on CPU 7 (core 0, socket 7)
Oct 10 06:10:32 np0005479823 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Oct 10 06:10:32 np0005479823 systemd[1]: Started Process Core Dump (PID 238879/UID 0).
Oct 10 06:10:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:33.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:34 np0005479823 systemd-coredump[238880]: Process 236928 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 60:#012#0  0x00007f9e3fab232e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Oct 10 06:10:34 np0005479823 systemd[1]: systemd-coredump@8-238879-0.service: Deactivated successfully.
Oct 10 06:10:34 np0005479823 systemd[1]: systemd-coredump@8-238879-0.service: Consumed 1.172s CPU time.
Oct 10 06:10:34 np0005479823 podman[238885]: 2025-10-10 10:10:34.214515272 +0000 UTC m=+0.021911511 container died 90bb825f5e1d14c7a708252a75b1b4311bb0c31961179dd5584f706db20938c5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 06:10:34 np0005479823 systemd[1]: var-lib-containers-storage-overlay-d83eab439fa5a3e9abff5a44dbe3ed5529a7fd8a5d250f9e424122df311085d4-merged.mount: Deactivated successfully.
Oct 10 06:10:34 np0005479823 podman[238885]: 2025-10-10 10:10:34.253969602 +0000 UTC m=+0.061365821 container remove 90bb825f5e1d14c7a708252a75b1b4311bb0c31961179dd5584f706db20938c5 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Oct 10 06:10:34 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Main process exited, code=exited, status=139/n/a
Oct 10 06:10:34 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Failed with result 'exit-code'.
Oct 10 06:10:34 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.754s CPU time.
Oct 10 06:10:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:10:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:34.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:10:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:10:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:10:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:35.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:10:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e149 e149: 3 total, 3 up, 3 in
Oct 10 06:10:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:10:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:36.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:10:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:36 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e150 e150: 3 total, 3 up, 3 in
Oct 10 06:10:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:37.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:10:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:38.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:10:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/101038 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 06:10:39 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 e151: 3 total, 3 up, 3 in
Oct 10 06:10:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:39.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:10:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:40.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:10:41.463 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:10:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:10:41.463 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:10:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:10:41.463 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:10:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:41.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:42.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:10:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:43.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:10:44 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Scheduled restart job, restart counter is at 9.
Oct 10 06:10:44 np0005479823 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 06:10:44 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.754s CPU time.
Oct 10 06:10:44 np0005479823 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 06:10:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:10:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:44.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:10:44 np0005479823 podman[239015]: 2025-10-10 10:10:44.709954275 +0000 UTC m=+0.042750726 container create 7dae9cbe8c9218f23caf45eec31fe766f4d661e9a30c0147374f6aedc1c47a31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 06:10:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:44 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88875b0b4a98559c29846cc1430ee2bc98721cecb6b6468b744e1d314bff0520/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 06:10:44 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88875b0b4a98559c29846cc1430ee2bc98721cecb6b6468b744e1d314bff0520/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 06:10:44 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88875b0b4a98559c29846cc1430ee2bc98721cecb6b6468b744e1d314bff0520/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 06:10:44 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88875b0b4a98559c29846cc1430ee2bc98721cecb6b6468b744e1d314bff0520/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 06:10:44 np0005479823 podman[239015]: 2025-10-10 10:10:44.771442138 +0000 UTC m=+0.104238599 container init 7dae9cbe8c9218f23caf45eec31fe766f4d661e9a30c0147374f6aedc1c47a31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Oct 10 06:10:44 np0005479823 podman[239015]: 2025-10-10 10:10:44.782658696 +0000 UTC m=+0.115455117 container start 7dae9cbe8c9218f23caf45eec31fe766f4d661e9a30c0147374f6aedc1c47a31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Oct 10 06:10:44 np0005479823 podman[239015]: 2025-10-10 10:10:44.689598505 +0000 UTC m=+0.022394946 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 06:10:44 np0005479823 bash[239015]: 7dae9cbe8c9218f23caf45eec31fe766f4d661e9a30c0147374f6aedc1c47a31
Oct 10 06:10:44 np0005479823 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 06:10:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:44 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 06:10:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:44 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 06:10:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:44 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 06:10:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:44 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 06:10:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:44 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 06:10:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:44 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 06:10:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:44 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 06:10:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:44 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:10:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:10:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000065s ======
Oct 10 06:10:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:45.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Oct 10 06:10:45 np0005479823 nova_compute[235775]: 2025-10-10 10:10:45.816 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:10:45 np0005479823 nova_compute[235775]: 2025-10-10 10:10:45.816 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 10 06:10:45 np0005479823 nova_compute[235775]: 2025-10-10 10:10:45.816 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 10 06:10:45 np0005479823 nova_compute[235775]: 2025-10-10 10:10:45.842 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct 10 06:10:45 np0005479823 nova_compute[235775]: 2025-10-10 10:10:45.842 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:10:45 np0005479823 nova_compute[235775]: 2025-10-10 10:10:45.842 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct 10 06:10:45 np0005479823 nova_compute[235775]: 2025-10-10 10:10:45.863 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct 10 06:10:45 np0005479823 nova_compute[235775]: 2025-10-10 10:10:45.864 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:10:45 np0005479823 nova_compute[235775]: 2025-10-10 10:10:45.864 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct 10 06:10:45 np0005479823 nova_compute[235775]: 2025-10-10 10:10:45.877 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:10:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:46.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:47.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:47 np0005479823 nova_compute[235775]: 2025-10-10 10:10:47.861 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:10:47 np0005479823 nova_compute[235775]: 2025-10-10 10:10:47.861 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:10:47 np0005479823 nova_compute[235775]: 2025-10-10 10:10:47.861 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:10:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:48.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:49.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:49 np0005479823 podman[239080]: 2025-10-10 10:10:49.786853122 +0000 UTC m=+0.056515625 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 06:10:49 np0005479823 podman[239078]: 2025-10-10 10:10:49.803913227 +0000 UTC m=+0.083464047 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd)
Oct 10 06:10:49 np0005479823 nova_compute[235775]: 2025-10-10 10:10:49.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:10:49 np0005479823 nova_compute[235775]: 2025-10-10 10:10:49.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:10:49 np0005479823 nova_compute[235775]: 2025-10-10 10:10:49.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct 10 06:10:49 np0005479823 podman[239079]: 2025-10-10 10:10:49.834768592 +0000 UTC m=+0.109954482 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 10 06:10:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:10:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:50.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:50 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:10:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:50 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:10:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:51.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:51 np0005479823 nova_compute[235775]: 2025-10-10 10:10:51.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:10:51 np0005479823 nova_compute[235775]: 2025-10-10 10:10:51.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:10:51 np0005479823 nova_compute[235775]: 2025-10-10 10:10:51.934 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:10:51 np0005479823 nova_compute[235775]: 2025-10-10 10:10:51.935 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:10:51 np0005479823 nova_compute[235775]: 2025-10-10 10:10:51.935 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:10:51 np0005479823 nova_compute[235775]: 2025-10-10 10:10:51.936 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 10 06:10:51 np0005479823 nova_compute[235775]: 2025-10-10 10:10:51.936 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:10:52 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:10:52 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1333009736' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:10:52 np0005479823 nova_compute[235775]: 2025-10-10 10:10:52.406 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:10:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:10:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:52.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:10:52 np0005479823 nova_compute[235775]: 2025-10-10 10:10:52.546 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:10:52 np0005479823 nova_compute[235775]: 2025-10-10 10:10:52.548 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5221MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 10 06:10:52 np0005479823 nova_compute[235775]: 2025-10-10 10:10:52.548 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:10:52 np0005479823 nova_compute[235775]: 2025-10-10 10:10:52.548 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:10:52 np0005479823 nova_compute[235775]: 2025-10-10 10:10:52.661 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 10 06:10:52 np0005479823 nova_compute[235775]: 2025-10-10 10:10:52.661 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 10 06:10:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:52 np0005479823 nova_compute[235775]: 2025-10-10 10:10:52.745 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Refreshing inventories for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct 10 06:10:52 np0005479823 nova_compute[235775]: 2025-10-10 10:10:52.769 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Updating ProviderTree inventory for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct 10 06:10:52 np0005479823 nova_compute[235775]: 2025-10-10 10:10:52.769 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Updating inventory in ProviderTree for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct 10 06:10:52 np0005479823 nova_compute[235775]: 2025-10-10 10:10:52.867 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Refreshing aggregate associations for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct 10 06:10:52 np0005479823 nova_compute[235775]: 2025-10-10 10:10:52.893 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Refreshing trait associations for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0, traits: HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_FMA3,HW_CPU_X86_SVM,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_CLMUL,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_BMI2,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct 10 06:10:52 np0005479823 nova_compute[235775]: 2025-10-10 10:10:52.919 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:10:53 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:10:53 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/547334843' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:10:53 np0005479823 nova_compute[235775]: 2025-10-10 10:10:53.338 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:10:53 np0005479823 nova_compute[235775]: 2025-10-10 10:10:53.342 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 10 06:10:53 np0005479823 nova_compute[235775]: 2025-10-10 10:10:53.360 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 10 06:10:53 np0005479823 nova_compute[235775]: 2025-10-10 10:10:53.361 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct 10 06:10:53 np0005479823 nova_compute[235775]: 2025-10-10 10:10:53.361 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.813s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:10:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:53.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:54 np0005479823 nova_compute[235775]: 2025-10-10 10:10:54.361 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:10:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:54.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:10:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:55.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:56.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 06:10:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Oct 10 06:10:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Oct 10 06:10:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Oct 10 06:10:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Oct 10 06:10:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Oct 10 06:10:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Oct 10 06:10:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 06:10:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 06:10:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 06:10:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Oct 10 06:10:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 06:10:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Oct 10 06:10:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Oct 10 06:10:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:56 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4000df0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6ca40016c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Oct 10 06:10:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Oct 10 06:10:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Oct 10 06:10:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Oct 10 06:10:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Oct 10 06:10:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Oct 10 06:10:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Oct 10 06:10:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Oct 10 06:10:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Oct 10 06:10:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Oct 10 06:10:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 06:10:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Oct 10 06:10:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 06:10:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c90000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:57.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:10:57 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Oct 10 06:10:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:10:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:10:58.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:10:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:58 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/101059 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 06:10:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:59 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:10:59 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:10:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:10:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:10:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:10:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:10:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:10:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:10:59.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:11:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:00.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:00 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c900016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:01 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:01 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c900016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:01.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:02.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:02 np0005479823 podman[239240]: 2025-10-10 10:11:02.809190701 +0000 UTC m=+0.074052165 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 06:11:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:02 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6ca4001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:03 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c001f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:03 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c001f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:03.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:04.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:04 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c001f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:05 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:05 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c900016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:11:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:11:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:05.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:11:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:06.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:06 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c900016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:07 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c001f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:07 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:11:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:07.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:11:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:11:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:08.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:11:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:08 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c900016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:09 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6ca4001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:09 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:11:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:09.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:11:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:11:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:10.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:10 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:11 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c900016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:11 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6ca4001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:11.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:12.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:12 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:13 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb40091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:13 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c900036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:11:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:13.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:11:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:14.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:14 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6ca4001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:15 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:15 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb40091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:11:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:11:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:15.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:11:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:16.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:16 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c900036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:17 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6ca4001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:17 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:17.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:18 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:11:18 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:11:18 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:11:18 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:11:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:11:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:18.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:11:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:18 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:19 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:19 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6ca4001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.255391) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091079255431, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 883, "num_deletes": 251, "total_data_size": 1687246, "memory_usage": 1713376, "flush_reason": "Manual Compaction"}
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091079263121, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 1112996, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25230, "largest_seqno": 26108, "table_properties": {"data_size": 1108996, "index_size": 1716, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9395, "raw_average_key_size": 19, "raw_value_size": 1100701, "raw_average_value_size": 2307, "num_data_blocks": 77, "num_entries": 477, "num_filter_entries": 477, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760091020, "oldest_key_time": 1760091020, "file_creation_time": 1760091079, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 7799 microseconds, and 3684 cpu microseconds.
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.263183) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 1112996 bytes OK
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.263209) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.264754) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.264808) EVENT_LOG_v1 {"time_micros": 1760091079264797, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.264889) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 1682734, prev total WAL file size 1682734, number of live WAL files 2.
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.265874) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(1086KB)], [48(12MB)]
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091079265950, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 14535991, "oldest_snapshot_seqno": -1}
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 5368 keys, 12453248 bytes, temperature: kUnknown
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091079337523, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 12453248, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12418205, "index_size": 20533, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13445, "raw_key_size": 137880, "raw_average_key_size": 25, "raw_value_size": 12321652, "raw_average_value_size": 2295, "num_data_blocks": 834, "num_entries": 5368, "num_filter_entries": 5368, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760091079, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.337779) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 12453248 bytes
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.339672) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 202.9 rd, 173.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 12.8 +0.0 blob) out(11.9 +0.0 blob), read-write-amplify(24.2) write-amplify(11.2) OK, records in: 5886, records dropped: 518 output_compression: NoCompression
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.339694) EVENT_LOG_v1 {"time_micros": 1760091079339685, "job": 28, "event": "compaction_finished", "compaction_time_micros": 71655, "compaction_time_cpu_micros": 25533, "output_level": 6, "num_output_files": 1, "total_output_size": 12453248, "num_input_records": 5886, "num_output_records": 5368, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091079340090, "job": 28, "event": "table_file_deletion", "file_number": 50}
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091079342653, "job": 28, "event": "table_file_deletion", "file_number": 48}
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.265731) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.342799) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.342807) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.342810) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.342813) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:11:19 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:11:19.342818) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:11:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:19.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:11:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:20.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:20 np0005479823 podman[239383]: 2025-10-10 10:11:20.809181933 +0000 UTC m=+0.077000130 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 06:11:20 np0005479823 podman[239385]: 2025-10-10 10:11:20.834972135 +0000 UTC m=+0.089661263 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid)
Oct 10 06:11:20 np0005479823 podman[239384]: 2025-10-10 10:11:20.836944649 +0000 UTC m=+0.099331313 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 10 06:11:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:20 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c900036e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:21 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:21 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:11:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:21.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:11:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:11:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:22.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:11:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:22 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:11:22.750 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct 10 06:11:22 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:11:22.751 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct 10 06:11:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:22 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6ca4001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:23 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6ca4001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:23 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:23 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:11:23 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:11:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:23.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:24.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:24 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:25 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6ca4001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:25 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6ca4001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:11:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000064s ======
Oct 10 06:11:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:25.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Oct 10 06:11:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:26.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:26 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:11:26.753 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:11:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:26 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:27 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:27 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6ca4001fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:11:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:27.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:11:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/101128 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 06:11:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:28.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:28 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c90003700 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:29 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:29 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:29.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:11:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:11:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:30.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:11:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:30 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:31 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c90003720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:31 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:31.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:32.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:32 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:33 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4009ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:33 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c90003740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:33 np0005479823 podman[239485]: 2025-10-10 10:11:33.77730261 +0000 UTC m=+0.055322437 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 06:11:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:33.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:34.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:34 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:35 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:35 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c000f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:11:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:11:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:35.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:11:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:36.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:36 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c900037f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:37 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:37 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:11:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:37 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:37.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:11:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:38.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:11:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:38 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c0010d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:39 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c90003810 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:39 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:39.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:40 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:11:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:40 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:11:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:11:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:40.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:40 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:41 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:41 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c001250 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:11:41.464 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:11:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:11:41.464 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:11:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:11:41.464 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:11:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:41.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:42.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:42 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:43 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:43 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 06:11:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:43 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:43.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:11:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:44.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:11:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:44 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c002410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:45 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c002410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:45 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:11:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:45 np0005479823 nova_compute[235775]: 2025-10-10 10:11:45.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:11:45 np0005479823 nova_compute[235775]: 2025-10-10 10:11:45.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 10 06:11:45 np0005479823 nova_compute[235775]: 2025-10-10 10:11:45.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 10 06:11:45 np0005479823 nova_compute[235775]: 2025-10-10 10:11:45.828 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct 10 06:11:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:11:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:45.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:11:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:11:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:46.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:11:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:46 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c002410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:47 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:47 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c840016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:47 np0005479823 nova_compute[235775]: 2025-10-10 10:11:47.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:11:47 np0005479823 nova_compute[235775]: 2025-10-10 10:11:47.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:11:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:47.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:48.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:48 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:49 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:49 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c002410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:49 np0005479823 nova_compute[235775]: 2025-10-10 10:11:49.809 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:11:49 np0005479823 nova_compute[235775]: 2025-10-10 10:11:49.828 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:11:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:49.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/101150 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 06:11:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:11:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:50.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:50 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84002050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:51 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:51 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c0040b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:51 np0005479823 podman[239550]: 2025-10-10 10:11:51.78905866 +0000 UTC m=+0.065388249 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251001)
Oct 10 06:11:51 np0005479823 podman[239552]: 2025-10-10 10:11:51.795614439 +0000 UTC m=+0.064943024 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 10 06:11:51 np0005479823 nova_compute[235775]: 2025-10-10 10:11:51.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:11:51 np0005479823 nova_compute[235775]: 2025-10-10 10:11:51.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:11:51 np0005479823 nova_compute[235775]: 2025-10-10 10:11:51.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:11:51 np0005479823 nova_compute[235775]: 2025-10-10 10:11:51.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct 10 06:11:51 np0005479823 podman[239551]: 2025-10-10 10:11:51.818551401 +0000 UTC m=+0.091890564 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Oct 10 06:11:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:51.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:52.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:52 np0005479823 nova_compute[235775]: 2025-10-10 10:11:52.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:11:52 np0005479823 nova_compute[235775]: 2025-10-10 10:11:52.843 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:11:52 np0005479823 nova_compute[235775]: 2025-10-10 10:11:52.843 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:11:52 np0005479823 nova_compute[235775]: 2025-10-10 10:11:52.844 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:11:52 np0005479823 nova_compute[235775]: 2025-10-10 10:11:52.844 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 10 06:11:52 np0005479823 nova_compute[235775]: 2025-10-10 10:11:52.844 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:11:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:52 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c003900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:53 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84002050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:53 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:53 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:11:53 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3217553215' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:11:53 np0005479823 nova_compute[235775]: 2025-10-10 10:11:53.261 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:11:53 np0005479823 nova_compute[235775]: 2025-10-10 10:11:53.413 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:11:53 np0005479823 nova_compute[235775]: 2025-10-10 10:11:53.414 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5186MB free_disk=59.89714813232422GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 10 06:11:53 np0005479823 nova_compute[235775]: 2025-10-10 10:11:53.414 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:11:53 np0005479823 nova_compute[235775]: 2025-10-10 10:11:53.415 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:11:53 np0005479823 nova_compute[235775]: 2025-10-10 10:11:53.478 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 10 06:11:53 np0005479823 nova_compute[235775]: 2025-10-10 10:11:53.479 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 10 06:11:53 np0005479823 nova_compute[235775]: 2025-10-10 10:11:53.498 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:11:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:53.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:53 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:11:53 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3655824811' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:11:53 np0005479823 nova_compute[235775]: 2025-10-10 10:11:53.905 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.407s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:11:53 np0005479823 nova_compute[235775]: 2025-10-10 10:11:53.912 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 10 06:11:53 np0005479823 nova_compute[235775]: 2025-10-10 10:11:53.956 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 10 06:11:53 np0005479823 nova_compute[235775]: 2025-10-10 10:11:53.957 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct 10 06:11:53 np0005479823 nova_compute[235775]: 2025-10-10 10:11:53.958 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.543s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:11:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:54.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:54 np0005479823 nova_compute[235775]: 2025-10-10 10:11:54.958 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:11:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:55 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c0040d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:55 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c003900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:55 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:11:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:55.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:56.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c0040f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:57 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c003aa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:57.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:11:58.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:11:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:59 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:59 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:11:59 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004110 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:11:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:11:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:11:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:11:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:11:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:11:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:11:59.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:12:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:12:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:00.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:12:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:01 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c003c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:01 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:01 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:01.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:02.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:03 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004110 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:03 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c003c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:03 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:03.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:04.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:04 np0005479823 podman[239694]: 2025-10-10 10:12:04.788519057 +0000 UTC m=+0.052886120 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 10 06:12:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:05 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:05 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004130 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:05 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c003c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:12:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:05.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:12:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:06.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:12:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:07 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:07 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:07 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004150 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:07.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:12:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:08.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:12:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:09 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c003c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:09 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:09 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:09.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:12:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:10.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:11 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004170 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:11 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c8c003c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:11 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:11.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:12:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:12.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:12:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:13 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c7c003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:13 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004190 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:13 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c90001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:13.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:14.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:15 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:15 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:15 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004190 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:12:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:15.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:16.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:17 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004190 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:17 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:17 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:17.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:18.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:19 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c90001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:19 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004190 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:19 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:12:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:19.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:12:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:12:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:20.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:21 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb4001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:21 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c90001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:21 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004190 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:21.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:22.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:22 np0005479823 podman[239784]: 2025-10-10 10:12:22.798953971 +0000 UTC m=+0.069044175 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 06:12:22 np0005479823 podman[239786]: 2025-10-10 10:12:22.814811327 +0000 UTC m=+0.075966685 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 06:12:22 np0005479823 podman[239785]: 2025-10-10 10:12:22.827699579 +0000 UTC m=+0.090431498 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 10 06:12:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:23 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004190 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:23 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb40089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:23 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c900029d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:23.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:24 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:12:24 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:12:24 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:12:24 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:12:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:12:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:24.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:12:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:25 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:25 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004190 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:25 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb40089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:12:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:12:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:25.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:12:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:26.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:27 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c900029d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:27 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c84004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:27 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c9c004190 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:27.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:28.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:29 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb40089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:29 np0005479823 kernel: ganesha.nfsd[239724]: segfault at 50 ip 00007f6d606f432e sp 00007f6d24ff8210 error 4 in libntirpc.so.5.8[7f6d606d9000+2c000] likely on CPU 5 (core 0, socket 5)
Oct 10 06:12:29 np0005479823 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Oct 10 06:12:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[239031]: 10/10/2025 10:12:29 : epoch 68e8dba4 : compute-2 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6cb40089d0 fd 38 proxy ignored for local
Oct 10 06:12:29 np0005479823 systemd[1]: Started Process Core Dump (PID 239936/UID 0).
Oct 10 06:12:29 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:12:29 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:12:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:29.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:30 np0005479823 systemd-coredump[239937]: Process 239035 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 59:#012#0  0x00007f6d606f432e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Oct 10 06:12:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:12:30 np0005479823 systemd[1]: systemd-coredump@9-239936-0.service: Deactivated successfully.
Oct 10 06:12:30 np0005479823 systemd[1]: systemd-coredump@9-239936-0.service: Consumed 1.188s CPU time.
Oct 10 06:12:30 np0005479823 podman[239942]: 2025-10-10 10:12:30.373640986 +0000 UTC m=+0.026125995 container died 7dae9cbe8c9218f23caf45eec31fe766f4d661e9a30c0147374f6aedc1c47a31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True)
Oct 10 06:12:30 np0005479823 systemd[1]: var-lib-containers-storage-overlay-88875b0b4a98559c29846cc1430ee2bc98721cecb6b6468b744e1d314bff0520-merged.mount: Deactivated successfully.
Oct 10 06:12:30 np0005479823 podman[239942]: 2025-10-10 10:12:30.41790686 +0000 UTC m=+0.070391899 container remove 7dae9cbe8c9218f23caf45eec31fe766f4d661e9a30c0147374f6aedc1c47a31 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Oct 10 06:12:30 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Main process exited, code=exited, status=139/n/a
Oct 10 06:12:30 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Failed with result 'exit-code'.
Oct 10 06:12:30 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.658s CPU time.
Oct 10 06:12:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:12:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:30.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:12:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:31.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:12:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:32.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:12:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:12:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:33.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:12:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:12:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:34.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:12:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/101235 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 06:12:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:12:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:35 np0005479823 podman[239993]: 2025-10-10 10:12:35.776707817 +0000 UTC m=+0.049403788 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_metadata_agent)
Oct 10 06:12:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:12:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:35.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:12:36 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:12:36.013 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct 10 06:12:36 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:12:36.014 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct 10 06:12:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:12:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:36.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:12:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:37.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:38.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:39 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:12:39.016 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:12:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:39.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:12:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:40.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:40 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Scheduled restart job, restart counter is at 10.
Oct 10 06:12:40 np0005479823 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 06:12:40 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.658s CPU time.
Oct 10 06:12:40 np0005479823 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 06:12:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:40 np0005479823 podman[240092]: 2025-10-10 10:12:40.898085695 +0000 UTC m=+0.040313868 container create a58146ebeac6f3e96fc00a3976530225da289dc1668a7392f1356cfc5bf92a24 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Oct 10 06:12:40 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3151ba89ddb65e49d6e0646af8407c3119e5a0ce05c4a2b5e5272229a03a14f/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 06:12:40 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3151ba89ddb65e49d6e0646af8407c3119e5a0ce05c4a2b5e5272229a03a14f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 06:12:40 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3151ba89ddb65e49d6e0646af8407c3119e5a0ce05c4a2b5e5272229a03a14f/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 06:12:40 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3151ba89ddb65e49d6e0646af8407c3119e5a0ce05c4a2b5e5272229a03a14f/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 06:12:40 np0005479823 podman[240092]: 2025-10-10 10:12:40.950058765 +0000 UTC m=+0.092286958 container init a58146ebeac6f3e96fc00a3976530225da289dc1668a7392f1356cfc5bf92a24 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 06:12:40 np0005479823 podman[240092]: 2025-10-10 10:12:40.956399627 +0000 UTC m=+0.098627810 container start a58146ebeac6f3e96fc00a3976530225da289dc1668a7392f1356cfc5bf92a24 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True)
Oct 10 06:12:40 np0005479823 bash[240092]: a58146ebeac6f3e96fc00a3976530225da289dc1668a7392f1356cfc5bf92a24
Oct 10 06:12:40 np0005479823 podman[240092]: 2025-10-10 10:12:40.879578545 +0000 UTC m=+0.021806748 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 06:12:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:40 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 06:12:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:40 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 06:12:40 np0005479823 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 06:12:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:41 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 06:12:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:41 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 06:12:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:41 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 06:12:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:41 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 06:12:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:41 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 06:12:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:41 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:12:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:12:41.465 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:12:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:12:41.465 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:12:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:12:41.465 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:12:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:12:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:41.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:12:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:12:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:42.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:12:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:43.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:44.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:12:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:45.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:46.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:47 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:12:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:47 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:12:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:47 np0005479823 nova_compute[235775]: 2025-10-10 10:12:47.809 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:12:47 np0005479823 nova_compute[235775]: 2025-10-10 10:12:47.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:12:47 np0005479823 nova_compute[235775]: 2025-10-10 10:12:47.813 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 10 06:12:47 np0005479823 nova_compute[235775]: 2025-10-10 10:12:47.813 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 10 06:12:47 np0005479823 nova_compute[235775]: 2025-10-10 10:12:47.836 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct 10 06:12:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:47.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:12:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:48.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:12:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:49 np0005479823 nova_compute[235775]: 2025-10-10 10:12:49.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:12:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:49.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:12:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:50.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:50 np0005479823 nova_compute[235775]: 2025-10-10 10:12:50.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:12:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:51 np0005479823 nova_compute[235775]: 2025-10-10 10:12:51.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:12:51 np0005479823 nova_compute[235775]: 2025-10-10 10:12:51.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:12:51 np0005479823 nova_compute[235775]: 2025-10-10 10:12:51.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct 10 06:12:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:51.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:12:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:52.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:12:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:52 np0005479823 nova_compute[235775]: 2025-10-10 10:12:52.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:12:52 np0005479823 nova_compute[235775]: 2025-10-10 10:12:52.816 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:12:52 np0005479823 nova_compute[235775]: 2025-10-10 10:12:52.841 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:12:52 np0005479823 nova_compute[235775]: 2025-10-10 10:12:52.841 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:12:52 np0005479823 nova_compute[235775]: 2025-10-10 10:12:52.841 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:12:52 np0005479823 nova_compute[235775]: 2025-10-10 10:12:52.842 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 10 06:12:52 np0005479823 nova_compute[235775]: 2025-10-10 10:12:52.842 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:53 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbce4000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:53 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:12:53 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4031376606' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:12:53 np0005479823 nova_compute[235775]: 2025-10-10 10:12:53.305 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:12:53 np0005479823 nova_compute[235775]: 2025-10-10 10:12:53.432 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:12:53 np0005479823 nova_compute[235775]: 2025-10-10 10:12:53.433 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5209MB free_disk=59.94288635253906GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 10 06:12:53 np0005479823 nova_compute[235775]: 2025-10-10 10:12:53.433 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:12:53 np0005479823 nova_compute[235775]: 2025-10-10 10:12:53.434 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:12:53 np0005479823 nova_compute[235775]: 2025-10-10 10:12:53.508 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 10 06:12:53 np0005479823 nova_compute[235775]: 2025-10-10 10:12:53.508 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 10 06:12:53 np0005479823 nova_compute[235775]: 2025-10-10 10:12:53.525 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:53 np0005479823 podman[240219]: 2025-10-10 10:12:53.771224183 +0000 UTC m=+0.051672931 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2)
Oct 10 06:12:53 np0005479823 podman[240221]: 2025-10-10 10:12:53.771660087 +0000 UTC m=+0.047843189 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 06:12:53 np0005479823 podman[240220]: 2025-10-10 10:12:53.798510584 +0000 UTC m=+0.074594882 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 10 06:12:53 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:12:53 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1943454956' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:12:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:53.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:53 np0005479823 nova_compute[235775]: 2025-10-10 10:12:53.940 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:12:53 np0005479823 nova_compute[235775]: 2025-10-10 10:12:53.945 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 10 06:12:53 np0005479823 nova_compute[235775]: 2025-10-10 10:12:53.967 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 10 06:12:53 np0005479823 nova_compute[235775]: 2025-10-10 10:12:53.969 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct 10 06:12:53 np0005479823 nova_compute[235775]: 2025-10-10 10:12:53.969 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.535s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:12:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:54.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:54 np0005479823 nova_compute[235775]: 2025-10-10 10:12:54.968 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:12:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:55 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbce0001c40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:55 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcbc000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:55 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcb8000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:12:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:55.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:56.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:57 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcc4000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/101257 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 06:12:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:57 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbce0001c40 fd 49 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:57 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcbc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:57.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:12:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:12:58.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:12:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:59 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcc4000fa0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:59 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcb80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:12:59 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcb0000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:12:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:12:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:12:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:12:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:12:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:12:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:12:59.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:13:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:13:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:00.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:13:01 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcb80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:13:01 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcc40020e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:13:01 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcbc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:01.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:02.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:13:03 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcb00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:13:03 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcb80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:13:03 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcb80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:13:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:03.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:13:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:04.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240107]: 10/10/2025 10:13:05 : epoch 68e8dc18 : compute-2 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fbcbc0016a0 fd 39 proxy ignored for local
Oct 10 06:13:05 np0005479823 kernel: ganesha.nfsd[240196]: segfault at 50 ip 00007fbd94a5932e sp 00007fbd517f9210 error 4 in libntirpc.so.5.8[7fbd94a3e000+2c000] likely on CPU 4 (core 0, socket 4)
Oct 10 06:13:05 np0005479823 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Oct 10 06:13:05 np0005479823 systemd[1]: Started Process Core Dump (PID 240323/UID 0).
Oct 10 06:13:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:13:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:13:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:05.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:13:06 np0005479823 systemd-coredump[240324]: Process 240111 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 55:#012#0  0x00007fbd94a5932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Oct 10 06:13:06 np0005479823 systemd[1]: systemd-coredump@10-240323-0.service: Deactivated successfully.
Oct 10 06:13:06 np0005479823 systemd[1]: systemd-coredump@10-240323-0.service: Consumed 1.196s CPU time.
Oct 10 06:13:06 np0005479823 podman[240330]: 2025-10-10 10:13:06.394991156 +0000 UTC m=+0.025346221 container died a58146ebeac6f3e96fc00a3976530225da289dc1668a7392f1356cfc5bf92a24 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Oct 10 06:13:06 np0005479823 systemd[1]: var-lib-containers-storage-overlay-f3151ba89ddb65e49d6e0646af8407c3119e5a0ce05c4a2b5e5272229a03a14f-merged.mount: Deactivated successfully.
Oct 10 06:13:06 np0005479823 podman[240330]: 2025-10-10 10:13:06.424784547 +0000 UTC m=+0.055139602 container remove a58146ebeac6f3e96fc00a3976530225da289dc1668a7392f1356cfc5bf92a24 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True)
Oct 10 06:13:06 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Main process exited, code=exited, status=139/n/a
Oct 10 06:13:06 np0005479823 podman[240329]: 2025-10-10 10:13:06.469561656 +0000 UTC m=+0.075020866 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, 
org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 06:13:06 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Failed with result 'exit-code'.
Oct 10 06:13:06 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.294s CPU time.
Oct 10 06:13:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:06.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:13:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:07.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:13:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:13:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:08.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:13:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:09.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:13:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:10.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/101311 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 06:13:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:11.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:12.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:13:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:13.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:13:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:13:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:14.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:13:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:13:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:15.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:16 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Scheduled restart job, restart counter is at 11.
Oct 10 06:13:16 np0005479823 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 06:13:16 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.294s CPU time.
Oct 10 06:13:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:16.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:16 np0005479823 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 06:13:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:16 np0005479823 podman[240449]: 2025-10-10 10:13:16.897486107 +0000 UTC m=+0.037370684 container create 846cd6823111ee42a70b41700b1a43ae41b27e9f805d56155411b3444ac3e4da (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Oct 10 06:13:16 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8f685c822357fb25a63d78c0de3edff79157420b24cde6f68449c7f664af3204/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 06:13:16 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8f685c822357fb25a63d78c0de3edff79157420b24cde6f68449c7f664af3204/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 06:13:16 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8f685c822357fb25a63d78c0de3edff79157420b24cde6f68449c7f664af3204/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 06:13:16 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8f685c822357fb25a63d78c0de3edff79157420b24cde6f68449c7f664af3204/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 06:13:16 np0005479823 podman[240449]: 2025-10-10 10:13:16.957077229 +0000 UTC m=+0.096961846 container init 846cd6823111ee42a70b41700b1a43ae41b27e9f805d56155411b3444ac3e4da (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Oct 10 06:13:16 np0005479823 podman[240449]: 2025-10-10 10:13:16.966939084 +0000 UTC m=+0.106823661 container start 846cd6823111ee42a70b41700b1a43ae41b27e9f805d56155411b3444ac3e4da (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 06:13:16 np0005479823 bash[240449]: 846cd6823111ee42a70b41700b1a43ae41b27e9f805d56155411b3444ac3e4da
Oct 10 06:13:16 np0005479823 podman[240449]: 2025-10-10 10:13:16.880180854 +0000 UTC m=+0.020065461 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 06:13:16 np0005479823 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 06:13:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:16 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 06:13:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:16 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 06:13:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:17 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 06:13:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:17 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 06:13:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:17 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 06:13:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:17 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 06:13:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:17 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 06:13:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:17 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:13:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:17.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:18.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:19.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:13:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:20.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:13:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:21.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:13:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:22.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:23 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:13:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:23 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:13:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:13:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:23.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:13:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:13:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:24.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:13:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:24 np0005479823 podman[240539]: 2025-10-10 10:13:24.805580189 +0000 UTC m=+0.069327274 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct 10 06:13:24 np0005479823 podman[240541]: 2025-10-10 10:13:24.807255913 +0000 UTC m=+0.068797648 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid)
Oct 10 06:13:24 np0005479823 podman[240540]: 2025-10-10 10:13:24.845133522 +0000 UTC m=+0.104896320 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Oct 10 06:13:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:13:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:25.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:26.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:27.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:28.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:13:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:29.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:13:30 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:13:30 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:13:30 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:13:30 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:13:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:13:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:30.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:31 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:31 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:31 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:31.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:32.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/101333 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Oct 10 06:13:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:33 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0dc001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:33 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:33 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:33.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:34 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:13:34 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:13:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:34.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:35 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:35 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0dc0023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:35 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:13:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:13:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:35.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:13:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:13:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:36.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:13:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:36 np0005479823 podman[240737]: 2025-10-10 10:13:36.809519155 +0000 UTC m=+0.070322617 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Oct 10 06:13:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:37 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:37 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:37 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0dc0023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:37 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:13:37.855 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct 10 06:13:37 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:13:37.856 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct 10 06:13:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:37.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:13:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:38.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:13:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:39 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:39 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:39 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:39.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:13:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:40.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:41 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0dc0023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:41 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:41 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:13:41.465 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:13:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:13:41.466 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:13:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:13:41.466 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:13:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:41.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:42.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:42 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:13:42.858 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:13:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:43 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:43 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0dc0034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:43 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:13:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:43.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:13:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/101344 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 06:13:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:13:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:44.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:13:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:45 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:45 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc0032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:45 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0dc0034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:13:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:45.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:46.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:47 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:47 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:47 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc0032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:13:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:47.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:13:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:48.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:49 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0dc0034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:49 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:49 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:49 np0005479823 nova_compute[235775]: 2025-10-10 10:13:49.809 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:13:49 np0005479823 nova_compute[235775]: 2025-10-10 10:13:49.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:13:49 np0005479823 nova_compute[235775]: 2025-10-10 10:13:49.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 10 06:13:49 np0005479823 nova_compute[235775]: 2025-10-10 10:13:49.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 10 06:13:49 np0005479823 nova_compute[235775]: 2025-10-10 10:13:49.830 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct 10 06:13:49 np0005479823 nova_compute[235775]: 2025-10-10 10:13:49.830 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:13:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:49.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:13:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:13:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:50.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:13:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:50 np0005479823 nova_compute[235775]: 2025-10-10 10:13:50.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:13:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:51 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:51 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0dc0034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:51 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:52.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:52.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:52 np0005479823 nova_compute[235775]: 2025-10-10 10:13:52.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:13:52 np0005479823 nova_compute[235775]: 2025-10-10 10:13:52.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:13:52 np0005479823 nova_compute[235775]: 2025-10-10 10:13:52.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:13:52 np0005479823 nova_compute[235775]: 2025-10-10 10:13:52.854 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:13:52 np0005479823 nova_compute[235775]: 2025-10-10 10:13:52.855 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:13:52 np0005479823 nova_compute[235775]: 2025-10-10 10:13:52.855 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:13:52 np0005479823 nova_compute[235775]: 2025-10-10 10:13:52.855 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 10 06:13:52 np0005479823 nova_compute[235775]: 2025-10-10 10:13:52.855 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:13:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:53 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:53 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:53 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:13:53 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2920225613' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:13:53 np0005479823 nova_compute[235775]: 2025-10-10 10:13:53.280 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:13:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:53 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0dc0034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:53 np0005479823 nova_compute[235775]: 2025-10-10 10:13:53.463 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:13:53 np0005479823 nova_compute[235775]: 2025-10-10 10:13:53.464 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5195MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 10 06:13:53 np0005479823 nova_compute[235775]: 2025-10-10 10:13:53.464 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:13:53 np0005479823 nova_compute[235775]: 2025-10-10 10:13:53.464 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:13:53 np0005479823 nova_compute[235775]: 2025-10-10 10:13:53.568 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 10 06:13:53 np0005479823 nova_compute[235775]: 2025-10-10 10:13:53.568 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 10 06:13:53 np0005479823 nova_compute[235775]: 2025-10-10 10:13:53.584 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:13:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:54.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:54 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:13:54 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3365250461' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:13:54 np0005479823 nova_compute[235775]: 2025-10-10 10:13:54.048 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:13:54 np0005479823 nova_compute[235775]: 2025-10-10 10:13:54.053 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 10 06:13:54 np0005479823 nova_compute[235775]: 2025-10-10 10:13:54.068 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 10 06:13:54 np0005479823 nova_compute[235775]: 2025-10-10 10:13:54.069 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct 10 06:13:54 np0005479823 nova_compute[235775]: 2025-10-10 10:13:54.069 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.605s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:13:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:54.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:55 np0005479823 nova_compute[235775]: 2025-10-10 10:13:55.065 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:13:55 np0005479823 nova_compute[235775]: 2025-10-10 10:13:55.080 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:13:55 np0005479823 nova_compute[235775]: 2025-10-10 10:13:55.081 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:13:55 np0005479823 nova_compute[235775]: 2025-10-10 10:13:55.081 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct 10 06:13:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:55 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:55 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0dc0034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:55 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:13:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:55 np0005479823 podman[240844]: 2025-10-10 10:13:55.786638953 +0000 UTC m=+0.062505351 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251001)
Oct 10 06:13:55 np0005479823 podman[240846]: 2025-10-10 10:13:55.801653897 +0000 UTC m=+0.064363641 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct 10 06:13:55 np0005479823 podman[240845]: 2025-10-10 10:13:55.81855471 +0000 UTC m=+0.087283798 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 10 06:13:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:56.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:56.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:57 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:57 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:57 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0dc0034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:13:58.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:13:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:13:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:13:58.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:13:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:59 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:59 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0bc004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:13:59 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:13:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:13:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:13:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:13:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:00.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:14:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:00.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:01 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0dc0034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:01 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:01 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b0000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:02.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:02.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:03 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:03 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:03 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:14:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:04.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:14:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:04.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:05 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:05 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:05 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:14:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:06.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:06.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:07 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:07 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:07 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:07 np0005479823 podman[240948]: 2025-10-10 10:14:07.784576386 +0000 UTC m=+0.057779500 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Oct 10 06:14:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:14:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:08.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:14:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:08.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:09 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:09 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:09 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:14:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:10.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:14:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:14:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:10.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:11 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:11 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:11 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:12.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:12.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:13 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/101413 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 06:14:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:13 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:13 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0008dc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:14.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:14.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:15 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:15 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:15 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:14:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:14:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:16.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:14:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:16.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:17 np0005479823 nova_compute[235775]: 2025-10-10 10:14:17.102 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:14:17 np0005479823 nova_compute[235775]: 2025-10-10 10:14:17.103 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:14:17 np0005479823 nova_compute[235775]: 2025-10-10 10:14:17.119 2 DEBUG nova.compute.manager [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct 10 06:14:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:17 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0009ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:17 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:17 np0005479823 nova_compute[235775]: 2025-10-10 10:14:17.206 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 06:14:17 np0005479823 nova_compute[235775]: 2025-10-10 10:14:17.206 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 06:14:17 np0005479823 nova_compute[235775]: 2025-10-10 10:14:17.213 2 DEBUG nova.virt.hardware [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 10 06:14:17 np0005479823 nova_compute[235775]: 2025-10-10 10:14:17.213 2 INFO nova.compute.claims [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Claim successful on node compute-2.ctlplane.example.com
Oct 10 06:14:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:17 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:17 np0005479823 nova_compute[235775]: 2025-10-10 10:14:17.330 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 06:14:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:17 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:14:17 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3515183494' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:14:17 np0005479823 nova_compute[235775]: 2025-10-10 10:14:17.837 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 06:14:17 np0005479823 nova_compute[235775]: 2025-10-10 10:14:17.844 2 DEBUG nova.compute.provider_tree [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 06:14:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:14:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:18.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:14:18 np0005479823 nova_compute[235775]: 2025-10-10 10:14:18.095 2 DEBUG nova.scheduler.client.report [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 06:14:18 np0005479823 nova_compute[235775]: 2025-10-10 10:14:18.116 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.910s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 06:14:18 np0005479823 nova_compute[235775]: 2025-10-10 10:14:18.117 2 DEBUG nova.compute.manager [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 10 06:14:18 np0005479823 nova_compute[235775]: 2025-10-10 10:14:18.158 2 DEBUG nova.compute.manager [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 10 06:14:18 np0005479823 nova_compute[235775]: 2025-10-10 10:14:18.158 2 DEBUG nova.network.neutron [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 10 06:14:18 np0005479823 nova_compute[235775]: 2025-10-10 10:14:18.189 2 INFO nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 10 06:14:18 np0005479823 nova_compute[235775]: 2025-10-10 10:14:18.209 2 DEBUG nova.compute.manager [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 10 06:14:18 np0005479823 nova_compute[235775]: 2025-10-10 10:14:18.305 2 DEBUG nova.compute.manager [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 10 06:14:18 np0005479823 nova_compute[235775]: 2025-10-10 10:14:18.307 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 10 06:14:18 np0005479823 nova_compute[235775]: 2025-10-10 10:14:18.308 2 INFO nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Creating image(s)
Oct 10 06:14:18 np0005479823 nova_compute[235775]: 2025-10-10 10:14:18.347 2 DEBUG nova.storage.rbd_utils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 06:14:18 np0005479823 nova_compute[235775]: 2025-10-10 10:14:18.386 2 DEBUG nova.storage.rbd_utils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 06:14:18 np0005479823 nova_compute[235775]: 2025-10-10 10:14:18.416 2 DEBUG nova.storage.rbd_utils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 06:14:18 np0005479823 nova_compute[235775]: 2025-10-10 10:14:18.420 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "eec5fe2328f977d3b1a385313e521aef425c0ac1" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 06:14:18 np0005479823 nova_compute[235775]: 2025-10-10 10:14:18.421 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "eec5fe2328f977d3b1a385313e521aef425c0ac1" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 06:14:18 np0005479823 nova_compute[235775]: 2025-10-10 10:14:18.694 2 DEBUG nova.virt.libvirt.imagebackend [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Image locations are: [{'url': 'rbd://21f084a3-af34-5230-afe4-ea5cd24a55f4/images/5ae78700-970d-45b4-a57d-978a054c7519/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://21f084a3-af34-5230-afe4-ea5cd24a55f4/images/5ae78700-970d-45b4-a57d-978a054c7519/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Oct 10 06:14:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:18.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:18 np0005479823 nova_compute[235775]: 2025-10-10 10:14:18.811 2 WARNING oslo_policy.policy [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Oct 10 06:14:18 np0005479823 nova_compute[235775]: 2025-10-10 10:14:18.811 2 WARNING oslo_policy.policy [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Oct 10 06:14:18 np0005479823 nova_compute[235775]: 2025-10-10 10:14:18.816 2 DEBUG nova.policy [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7956778c03764aaf8906c9b435337976', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd5e531d4b440422d946eaf6fd4e166f7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct 10 06:14:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:19 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:19 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e0009ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:19 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:19 np0005479823 nova_compute[235775]: 2025-10-10 10:14:19.491 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 06:14:19 np0005479823 nova_compute[235775]: 2025-10-10 10:14:19.561 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1.part --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 06:14:19 np0005479823 nova_compute[235775]: 2025-10-10 10:14:19.563 2 DEBUG nova.virt.images [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] 5ae78700-970d-45b4-a57d-978a054c7519 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Oct 10 06:14:19 np0005479823 nova_compute[235775]: 2025-10-10 10:14:19.565 2 DEBUG nova.privsep.utils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Oct 10 06:14:19 np0005479823 nova_compute[235775]: 2025-10-10 10:14:19.566 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1.part /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 06:14:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:19 np0005479823 nova_compute[235775]: 2025-10-10 10:14:19.788 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1.part /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1.converted" returned: 0 in 0.223s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 06:14:19 np0005479823 nova_compute[235775]: 2025-10-10 10:14:19.795 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 06:14:19 np0005479823 nova_compute[235775]: 2025-10-10 10:14:19.862 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1.converted --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 06:14:19 np0005479823 nova_compute[235775]: 2025-10-10 10:14:19.863 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "eec5fe2328f977d3b1a385313e521aef425c0ac1" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.443s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 06:14:19 np0005479823 nova_compute[235775]: 2025-10-10 10:14:19.888 2 DEBUG nova.storage.rbd_utils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 06:14:19 np0005479823 nova_compute[235775]: 2025-10-10 10:14:19.891 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1 f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 06:14:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:14:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:20.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:14:20 np0005479823 nova_compute[235775]: 2025-10-10 10:14:20.045 2 DEBUG nova.network.neutron [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Successfully created port: be812d6f-78ad-4f90-9cd0-0ae2444e7f71 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct 10 06:14:20 np0005479823 nova_compute[235775]: 2025-10-10 10:14:20.185 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1 f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.294s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 06:14:20 np0005479823 nova_compute[235775]: 2025-10-10 10:14:20.270 2 DEBUG nova.storage.rbd_utils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] resizing rbd image f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Oct 10 06:14:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:14:20 np0005479823 nova_compute[235775]: 2025-10-10 10:14:20.410 2 DEBUG nova.objects.instance [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lazy-loading 'migration_context' on Instance uuid f6ec6baf-a91e-4c7e-b1cf-b176d952068f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 10 06:14:20 np0005479823 nova_compute[235775]: 2025-10-10 10:14:20.423 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 10 06:14:20 np0005479823 nova_compute[235775]: 2025-10-10 10:14:20.424 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Ensure instance console log exists: /var/lib/nova/instances/f6ec6baf-a91e-4c7e-b1cf-b176d952068f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 10 06:14:20 np0005479823 nova_compute[235775]: 2025-10-10 10:14:20.424 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 06:14:20 np0005479823 nova_compute[235775]: 2025-10-10 10:14:20.425 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 06:14:20 np0005479823 nova_compute[235775]: 2025-10-10 10:14:20.425 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 06:14:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:20.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:21 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:21 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:21 np0005479823 nova_compute[235775]: 2025-10-10 10:14:21.292 2 DEBUG nova.network.neutron [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Successfully updated port: be812d6f-78ad-4f90-9cd0-0ae2444e7f71 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 10 06:14:21 np0005479823 nova_compute[235775]: 2025-10-10 10:14:21.312 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "refresh_cache-f6ec6baf-a91e-4c7e-b1cf-b176d952068f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 10 06:14:21 np0005479823 nova_compute[235775]: 2025-10-10 10:14:21.313 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquired lock "refresh_cache-f6ec6baf-a91e-4c7e-b1cf-b176d952068f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 10 06:14:21 np0005479823 nova_compute[235775]: 2025-10-10 10:14:21.313 2 DEBUG nova.network.neutron [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 10 06:14:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:21 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e000a7e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:21 np0005479823 nova_compute[235775]: 2025-10-10 10:14:21.429 2 DEBUG nova.compute.manager [req-3be46812-d39d-473c-8d79-50768ff11da9 req-8e2cd78f-5569-4b78-98ce-f656110b4c1b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Received event network-changed-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 06:14:21 np0005479823 nova_compute[235775]: 2025-10-10 10:14:21.429 2 DEBUG nova.compute.manager [req-3be46812-d39d-473c-8d79-50768ff11da9 req-8e2cd78f-5569-4b78-98ce-f656110b4c1b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Refreshing instance network info cache due to event network-changed-be812d6f-78ad-4f90-9cd0-0ae2444e7f71. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 10 06:14:21 np0005479823 nova_compute[235775]: 2025-10-10 10:14:21.430 2 DEBUG oslo_concurrency.lockutils [req-3be46812-d39d-473c-8d79-50768ff11da9 req-8e2cd78f-5569-4b78-98ce-f656110b4c1b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "refresh_cache-f6ec6baf-a91e-4c7e-b1cf-b176d952068f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 10 06:14:21 np0005479823 nova_compute[235775]: 2025-10-10 10:14:21.497 2 DEBUG nova.network.neutron [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 10 06:14:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:14:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:22.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.347 2 DEBUG nova.network.neutron [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Updating instance_info_cache with network_info: [{"id": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "address": "fa:16:3e:35:91:37", "network": {"id": "c8850c4c-dc38-4440-9c03-f2dd59684fe6", "bridge": "br-int", "label": "tempest-network-smoke--781461532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe812d6f-78", "ovs_interfaceid": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.366 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Releasing lock "refresh_cache-f6ec6baf-a91e-4c7e-b1cf-b176d952068f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.367 2 DEBUG nova.compute.manager [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Instance network_info: |[{"id": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "address": "fa:16:3e:35:91:37", "network": {"id": "c8850c4c-dc38-4440-9c03-f2dd59684fe6", "bridge": "br-int", "label": "tempest-network-smoke--781461532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe812d6f-78", "ovs_interfaceid": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.367 2 DEBUG oslo_concurrency.lockutils [req-3be46812-d39d-473c-8d79-50768ff11da9 req-8e2cd78f-5569-4b78-98ce-f656110b4c1b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquired lock "refresh_cache-f6ec6baf-a91e-4c7e-b1cf-b176d952068f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.367 2 DEBUG nova.network.neutron [req-3be46812-d39d-473c-8d79-50768ff11da9 req-8e2cd78f-5569-4b78-98ce-f656110b4c1b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Refreshing network info cache for port be812d6f-78ad-4f90-9cd0-0ae2444e7f71 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.370 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Start _get_guest_xml network_info=[{"id": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "address": "fa:16:3e:35:91:37", "network": {"id": "c8850c4c-dc38-4440-9c03-f2dd59684fe6", "bridge": "br-int", "label": "tempest-network-smoke--781461532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe812d6f-78", "ovs_interfaceid": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-10T10:09:50Z,direct_url=<?>,disk_format='qcow2',id=5ae78700-970d-45b4-a57d-978a054c7519,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ec962e275689437d80680ff3ea69c852',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-10T10:09:53Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'encryption_options': None, 'image_id': '5ae78700-970d-45b4-a57d-978a054c7519'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.374 2 WARNING nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.378 2 DEBUG nova.virt.libvirt.host [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.379 2 DEBUG nova.virt.libvirt.host [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.383 2 DEBUG nova.virt.libvirt.host [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.383 2 DEBUG nova.virt.libvirt.host [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.384 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.384 2 DEBUG nova.virt.hardware [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-10T10:09:48Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='00373e71-6208-4238-ad85-db0452c53bc6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-10T10:09:50Z,direct_url=<?>,disk_format='qcow2',id=5ae78700-970d-45b4-a57d-978a054c7519,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ec962e275689437d80680ff3ea69c852',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-10T10:09:53Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.385 2 DEBUG nova.virt.hardware [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.385 2 DEBUG nova.virt.hardware [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.385 2 DEBUG nova.virt.hardware [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.385 2 DEBUG nova.virt.hardware [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.385 2 DEBUG nova.virt.hardware [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.386 2 DEBUG nova.virt.hardware [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.386 2 DEBUG nova.virt.hardware [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.386 2 DEBUG nova.virt.hardware [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.386 2 DEBUG nova.virt.hardware [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.387 2 DEBUG nova.virt.hardware [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.390 2 DEBUG nova.privsep.utils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.390 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:14:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:22.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:22 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Oct 10 06:14:22 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1953104862' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.828 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.866 2 DEBUG nova.storage.rbd_utils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct 10 06:14:22 np0005479823 nova_compute[235775]: 2025-10-10 10:14:22.870 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:14:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:23 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e000a7e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:23 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:23 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Oct 10 06:14:23 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1711456005' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 06:14:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:23 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.297 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.299 2 DEBUG nova.virt.libvirt.vif [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-10T10:14:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1362038391',display_name='tempest-TestNetworkBasicOps-server-1362038391',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1362038391',id=5,image_ref='5ae78700-970d-45b4-a57d-978a054c7519',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOaO/Dm5TZJdJA+p0WorpE1s/wHDKiboiIskSllf2vhdjUj1oz81caVPGQVtZrwI+VVMAczLEmtRNwhb15+QK4so2BghvGEI3ChmYsvOZuU3tzU+nN+IQyotPE2q48Vw5A==',key_name='tempest-TestNetworkBasicOps-804562104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d5e531d4b440422d946eaf6fd4e166f7',ramdisk_id='',reservation_id='r-ksfjfy6b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5ae78700-970d-45b4-a57d-978a054c7519',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-188749107',owner_user_name='tempest-TestNetworkBasicOps-188749107-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-10T10:14:18Z,user_data=None,user_id='7956778c03764aaf8906c9b435337976',uuid=f6ec6baf-a91e-4c7e-b1cf-b176d952068f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "address": "fa:16:3e:35:91:37", "network": {"id": "c8850c4c-dc38-4440-9c03-f2dd59684fe6", "bridge": "br-int", "label": "tempest-network-smoke--781461532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe812d6f-78", "ovs_interfaceid": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.299 2 DEBUG nova.network.os_vif_util [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converting VIF {"id": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "address": "fa:16:3e:35:91:37", "network": {"id": "c8850c4c-dc38-4440-9c03-f2dd59684fe6", "bridge": "br-int", "label": "tempest-network-smoke--781461532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe812d6f-78", "ovs_interfaceid": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.301 2 DEBUG nova.network.os_vif_util [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:91:37,bridge_name='br-int',has_traffic_filtering=True,id=be812d6f-78ad-4f90-9cd0-0ae2444e7f71,network=Network(c8850c4c-dc38-4440-9c03-f2dd59684fe6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe812d6f-78') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.304 2 DEBUG nova.objects.instance [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid f6ec6baf-a91e-4c7e-b1cf-b176d952068f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct 10 06:14:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:23 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.330 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] End _get_guest_xml xml=<domain type="kvm">
Oct 10 06:14:23 np0005479823 nova_compute[235775]:  <uuid>f6ec6baf-a91e-4c7e-b1cf-b176d952068f</uuid>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:  <name>instance-00000005</name>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:  <memory>131072</memory>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:  <vcpu>1</vcpu>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:  <metadata>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <nova:name>tempest-TestNetworkBasicOps-server-1362038391</nova:name>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <nova:creationTime>2025-10-10 10:14:22</nova:creationTime>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <nova:flavor name="m1.nano">
Oct 10 06:14:23 np0005479823 nova_compute[235775]:        <nova:memory>128</nova:memory>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:        <nova:disk>1</nova:disk>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:        <nova:swap>0</nova:swap>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:        <nova:ephemeral>0</nova:ephemeral>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:        <nova:vcpus>1</nova:vcpus>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      </nova:flavor>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <nova:owner>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:        <nova:user uuid="7956778c03764aaf8906c9b435337976">tempest-TestNetworkBasicOps-188749107-project-member</nova:user>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:        <nova:project uuid="d5e531d4b440422d946eaf6fd4e166f7">tempest-TestNetworkBasicOps-188749107</nova:project>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      </nova:owner>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <nova:root type="image" uuid="5ae78700-970d-45b4-a57d-978a054c7519"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <nova:ports>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:        <nova:port uuid="be812d6f-78ad-4f90-9cd0-0ae2444e7f71">
Oct 10 06:14:23 np0005479823 nova_compute[235775]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:        </nova:port>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      </nova:ports>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    </nova:instance>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:  </metadata>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:  <sysinfo type="smbios">
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <system>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <entry name="manufacturer">RDO</entry>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <entry name="product">OpenStack Compute</entry>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <entry name="serial">f6ec6baf-a91e-4c7e-b1cf-b176d952068f</entry>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <entry name="uuid">f6ec6baf-a91e-4c7e-b1cf-b176d952068f</entry>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <entry name="family">Virtual Machine</entry>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    </system>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:  </sysinfo>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:  <os>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <type arch="x86_64" machine="q35">hvm</type>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <boot dev="hd"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <smbios mode="sysinfo"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:  </os>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:  <features>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <acpi/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <apic/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <vmcoreinfo/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:  </features>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:  <clock offset="utc">
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <timer name="pit" tickpolicy="delay"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <timer name="rtc" tickpolicy="catchup"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <timer name="hpet" present="no"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:  </clock>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:  <cpu mode="host-model" match="exact">
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <topology sockets="1" cores="1" threads="1"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:  </cpu>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:  <devices>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <disk type="network" device="disk">
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <driver type="raw" cache="none"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <source protocol="rbd" name="vms/f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk">
Oct 10 06:14:23 np0005479823 nova_compute[235775]:        <host name="192.168.122.100" port="6789"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:        <host name="192.168.122.102" port="6789"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:        <host name="192.168.122.101" port="6789"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      </source>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <auth username="openstack">
Oct 10 06:14:23 np0005479823 nova_compute[235775]:        <secret type="ceph" uuid="21f084a3-af34-5230-afe4-ea5cd24a55f4"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      </auth>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <target dev="vda" bus="virtio"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    </disk>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <disk type="network" device="cdrom">
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <driver type="raw" cache="none"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <source protocol="rbd" name="vms/f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk.config">
Oct 10 06:14:23 np0005479823 nova_compute[235775]:        <host name="192.168.122.100" port="6789"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:        <host name="192.168.122.102" port="6789"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:        <host name="192.168.122.101" port="6789"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      </source>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <auth username="openstack">
Oct 10 06:14:23 np0005479823 nova_compute[235775]:        <secret type="ceph" uuid="21f084a3-af34-5230-afe4-ea5cd24a55f4"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      </auth>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <target dev="sda" bus="sata"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    </disk>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <interface type="ethernet">
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <mac address="fa:16:3e:35:91:37"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <model type="virtio"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <driver name="vhost" rx_queue_size="512"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <mtu size="1442"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <target dev="tapbe812d6f-78"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    </interface>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <serial type="pty">
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <log file="/var/lib/nova/instances/f6ec6baf-a91e-4c7e-b1cf-b176d952068f/console.log" append="off"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    </serial>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <video>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <model type="virtio"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    </video>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <input type="tablet" bus="usb"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <rng model="virtio">
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <backend model="random">/dev/urandom</backend>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    </rng>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <controller type="usb" index="0"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    <memballoon model="virtio">
Oct 10 06:14:23 np0005479823 nova_compute[235775]:      <stats period="10"/>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:    </memballoon>
Oct 10 06:14:23 np0005479823 nova_compute[235775]:  </devices>
Oct 10 06:14:23 np0005479823 nova_compute[235775]: </domain>
Oct 10 06:14:23 np0005479823 nova_compute[235775]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.333 2 DEBUG nova.compute.manager [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Preparing to wait for external event network-vif-plugged-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.334 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.335 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.335 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.336 2 DEBUG nova.virt.libvirt.vif [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-10T10:14:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1362038391',display_name='tempest-TestNetworkBasicOps-server-1362038391',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1362038391',id=5,image_ref='5ae78700-970d-45b4-a57d-978a054c7519',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOaO/Dm5TZJdJA+p0WorpE1s/wHDKiboiIskSllf2vhdjUj1oz81caVPGQVtZrwI+VVMAczLEmtRNwhb15+QK4so2BghvGEI3ChmYsvOZuU3tzU+nN+IQyotPE2q48Vw5A==',key_name='tempest-TestNetworkBasicOps-804562104',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d5e531d4b440422d946eaf6fd4e166f7',ramdisk_id='',reservation_id='r-ksfjfy6b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5ae78700-970d-45b4-a57d-978a054c7519',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-188749107',owner_user_name='tempest-TestNetworkBasicOps-188749107-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-10T10:14:18Z,user_data=None,user_id='7956778c03764aaf8906c9b435337976',uuid=f6ec6baf-a91e-4c7e-b1cf-b176d952068f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "address": "fa:16:3e:35:91:37", "network": {"id": "c8850c4c-dc38-4440-9c03-f2dd59684fe6", "bridge": "br-int", "label": "tempest-network-smoke--781461532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe812d6f-78", "ovs_interfaceid": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.336 2 DEBUG nova.network.os_vif_util [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converting VIF {"id": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "address": "fa:16:3e:35:91:37", "network": {"id": "c8850c4c-dc38-4440-9c03-f2dd59684fe6", "bridge": "br-int", "label": "tempest-network-smoke--781461532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe812d6f-78", "ovs_interfaceid": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.337 2 DEBUG nova.network.os_vif_util [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:91:37,bridge_name='br-int',has_traffic_filtering=True,id=be812d6f-78ad-4f90-9cd0-0ae2444e7f71,network=Network(c8850c4c-dc38-4440-9c03-f2dd59684fe6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe812d6f-78') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.338 2 DEBUG os_vif [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:91:37,bridge_name='br-int',has_traffic_filtering=True,id=be812d6f-78ad-4f90-9cd0-0ae2444e7f71,network=Network(c8850c4c-dc38-4440-9c03-f2dd59684fe6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe812d6f-78') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.383 2 DEBUG nova.network.neutron [req-3be46812-d39d-473c-8d79-50768ff11da9 req-8e2cd78f-5569-4b78-98ce-f656110b4c1b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Updated VIF entry in instance network info cache for port be812d6f-78ad-4f90-9cd0-0ae2444e7f71. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.383 2 DEBUG nova.network.neutron [req-3be46812-d39d-473c-8d79-50768ff11da9 req-8e2cd78f-5569-4b78-98ce-f656110b4c1b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Updating instance_info_cache with network_info: [{"id": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "address": "fa:16:3e:35:91:37", "network": {"id": "c8850c4c-dc38-4440-9c03-f2dd59684fe6", "bridge": "br-int", "label": "tempest-network-smoke--781461532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe812d6f-78", "ovs_interfaceid": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.392 2 DEBUG ovsdbapp.backend.ovs_idl [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.393 2 DEBUG ovsdbapp.backend.ovs_idl [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.393 2 DEBUG ovsdbapp.backend.ovs_idl [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [POLLOUT] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.398 2 DEBUG oslo_concurrency.lockutils [req-3be46812-d39d-473c-8d79-50768ff11da9 req-8e2cd78f-5569-4b78-98ce-f656110b4c1b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Releasing lock "refresh_cache-f6ec6baf-a91e-4c7e-b1cf-b176d952068f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.409 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.409 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct 10 06:14:23 np0005479823 nova_compute[235775]: 2025-10-10 10:14:23.410 2 INFO oslo.privsep.daemon [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpd1vgorxc/privsep.sock']#033[00m
Oct 10 06:14:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:24.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:24 np0005479823 nova_compute[235775]: 2025-10-10 10:14:24.122 2 INFO oslo.privsep.daemon [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Oct 10 06:14:24 np0005479823 nova_compute[235775]: 2025-10-10 10:14:24.021 697 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct 10 06:14:24 np0005479823 nova_compute[235775]: 2025-10-10 10:14:24.029 697 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct 10 06:14:24 np0005479823 nova_compute[235775]: 2025-10-10 10:14:24.033 697 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Oct 10 06:14:24 np0005479823 nova_compute[235775]: 2025-10-10 10:14:24.034 697 INFO oslo.privsep.daemon [-] privsep daemon running as pid 697#033[00m
Oct 10 06:14:24 np0005479823 nova_compute[235775]: 2025-10-10 10:14:24.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:24 np0005479823 nova_compute[235775]: 2025-10-10 10:14:24.512 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbe812d6f-78, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:14:24 np0005479823 nova_compute[235775]: 2025-10-10 10:14:24.513 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbe812d6f-78, col_values=(('external_ids', {'iface-id': 'be812d6f-78ad-4f90-9cd0-0ae2444e7f71', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:35:91:37', 'vm-uuid': 'f6ec6baf-a91e-4c7e-b1cf-b176d952068f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:14:24 np0005479823 nova_compute[235775]: 2025-10-10 10:14:24.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:24 np0005479823 NetworkManager[44866]: <info>  [1760091264.5163] manager: (tapbe812d6f-78): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Oct 10 06:14:24 np0005479823 nova_compute[235775]: 2025-10-10 10:14:24.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct 10 06:14:24 np0005479823 nova_compute[235775]: 2025-10-10 10:14:24.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:24 np0005479823 nova_compute[235775]: 2025-10-10 10:14:24.524 2 INFO os_vif [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:91:37,bridge_name='br-int',has_traffic_filtering=True,id=be812d6f-78ad-4f90-9cd0-0ae2444e7f71,network=Network(c8850c4c-dc38-4440-9c03-f2dd59684fe6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe812d6f-78')#033[00m
Oct 10 06:14:24 np0005479823 nova_compute[235775]: 2025-10-10 10:14:24.573 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct 10 06:14:24 np0005479823 nova_compute[235775]: 2025-10-10 10:14:24.574 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct 10 06:14:24 np0005479823 nova_compute[235775]: 2025-10-10 10:14:24.574 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] No VIF found with MAC fa:16:3e:35:91:37, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct 10 06:14:24 np0005479823 nova_compute[235775]: 2025-10-10 10:14:24.574 2 INFO nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Using config drive#033[00m
Oct 10 06:14:24 np0005479823 nova_compute[235775]: 2025-10-10 10:14:24.604 2 DEBUG nova.storage.rbd_utils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct 10 06:14:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:24.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:25 np0005479823 nova_compute[235775]: 2025-10-10 10:14:25.018 2 INFO nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Creating config drive at /var/lib/nova/instances/f6ec6baf-a91e-4c7e-b1cf-b176d952068f/disk.config#033[00m
Oct 10 06:14:25 np0005479823 nova_compute[235775]: 2025-10-10 10:14:25.030 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f6ec6baf-a91e-4c7e-b1cf-b176d952068f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj48e1hdh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:14:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:25 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e000a7e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:25 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:25 np0005479823 nova_compute[235775]: 2025-10-10 10:14:25.180 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f6ec6baf-a91e-4c7e-b1cf-b176d952068f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj48e1hdh" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:14:25 np0005479823 nova_compute[235775]: 2025-10-10 10:14:25.212 2 DEBUG nova.storage.rbd_utils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct 10 06:14:25 np0005479823 nova_compute[235775]: 2025-10-10 10:14:25.215 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f6ec6baf-a91e-4c7e-b1cf-b176d952068f/disk.config f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:14:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:14:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:25 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:25 np0005479823 nova_compute[235775]: 2025-10-10 10:14:25.354 2 DEBUG oslo_concurrency.processutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f6ec6baf-a91e-4c7e-b1cf-b176d952068f/disk.config f6ec6baf-a91e-4c7e-b1cf-b176d952068f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:14:25 np0005479823 nova_compute[235775]: 2025-10-10 10:14:25.356 2 INFO nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Deleting local config drive /var/lib/nova/instances/f6ec6baf-a91e-4c7e-b1cf-b176d952068f/disk.config because it was imported into RBD.#033[00m
Oct 10 06:14:25 np0005479823 systemd[1]: Starting libvirt secret daemon...
Oct 10 06:14:25 np0005479823 systemd[1]: Started libvirt secret daemon.
Oct 10 06:14:25 np0005479823 kernel: tun: Universal TUN/TAP device driver, 1.6
Oct 10 06:14:25 np0005479823 kernel: tapbe812d6f-78: entered promiscuous mode
Oct 10 06:14:25 np0005479823 NetworkManager[44866]: <info>  [1760091265.4758] manager: (tapbe812d6f-78): new Tun device (/org/freedesktop/NetworkManager/Devices/26)
Oct 10 06:14:25 np0005479823 ovn_controller[132503]: 2025-10-10T10:14:25Z|00027|binding|INFO|Claiming lport be812d6f-78ad-4f90-9cd0-0ae2444e7f71 for this chassis.
Oct 10 06:14:25 np0005479823 nova_compute[235775]: 2025-10-10 10:14:25.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:25 np0005479823 ovn_controller[132503]: 2025-10-10T10:14:25Z|00028|binding|INFO|be812d6f-78ad-4f90-9cd0-0ae2444e7f71: Claiming fa:16:3e:35:91:37 10.100.0.11
Oct 10 06:14:25 np0005479823 nova_compute[235775]: 2025-10-10 10:14:25.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:25 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:25.497 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:91:37 10.100.0.11'], port_security=['fa:16:3e:35:91:37 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'f6ec6baf-a91e-4c7e-b1cf-b176d952068f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8850c4c-dc38-4440-9c03-f2dd59684fe6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5e531d4b440422d946eaf6fd4e166f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2502283d-b38d-456e-8e7f-133a87baf32b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=21e2152f-e965-46e3-9774-988f8fdf189b, chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>], logical_port=be812d6f-78ad-4f90-9cd0-0ae2444e7f71) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct 10 06:14:25 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:25.499 141795 INFO neutron.agent.ovn.metadata.agent [-] Port be812d6f-78ad-4f90-9cd0-0ae2444e7f71 in datapath c8850c4c-dc38-4440-9c03-f2dd59684fe6 bound to our chassis#033[00m
Oct 10 06:14:25 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:25.502 141795 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c8850c4c-dc38-4440-9c03-f2dd59684fe6#033[00m
Oct 10 06:14:25 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:25.505 141795 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp5qd_lucz/privsep.sock']#033[00m
Oct 10 06:14:25 np0005479823 systemd-machined[192768]: New machine qemu-1-instance-00000005.
Oct 10 06:14:25 np0005479823 systemd-udevd[241382]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 06:14:25 np0005479823 NetworkManager[44866]: <info>  [1760091265.5612] device (tapbe812d6f-78): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 10 06:14:25 np0005479823 NetworkManager[44866]: <info>  [1760091265.5626] device (tapbe812d6f-78): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 10 06:14:25 np0005479823 systemd[1]: Started Virtual Machine qemu-1-instance-00000005.
Oct 10 06:14:25 np0005479823 nova_compute[235775]: 2025-10-10 10:14:25.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:25 np0005479823 ovn_controller[132503]: 2025-10-10T10:14:25Z|00029|binding|INFO|Setting lport be812d6f-78ad-4f90-9cd0-0ae2444e7f71 ovn-installed in OVS
Oct 10 06:14:25 np0005479823 ovn_controller[132503]: 2025-10-10T10:14:25Z|00030|binding|INFO|Setting lport be812d6f-78ad-4f90-9cd0-0ae2444e7f71 up in Southbound
Oct 10 06:14:25 np0005479823 nova_compute[235775]: 2025-10-10 10:14:25.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:25 np0005479823 nova_compute[235775]: 2025-10-10 10:14:25.956 2 DEBUG nova.compute.manager [req-63750100-6233-4a41-a18e-6fa5625a9fd0 req-680961cf-f09f-43b2-a6b5-46fda95083f1 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Received event network-vif-plugged-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct 10 06:14:25 np0005479823 nova_compute[235775]: 2025-10-10 10:14:25.956 2 DEBUG oslo_concurrency.lockutils [req-63750100-6233-4a41-a18e-6fa5625a9fd0 req-680961cf-f09f-43b2-a6b5-46fda95083f1 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:14:25 np0005479823 nova_compute[235775]: 2025-10-10 10:14:25.957 2 DEBUG oslo_concurrency.lockutils [req-63750100-6233-4a41-a18e-6fa5625a9fd0 req-680961cf-f09f-43b2-a6b5-46fda95083f1 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:14:25 np0005479823 nova_compute[235775]: 2025-10-10 10:14:25.957 2 DEBUG oslo_concurrency.lockutils [req-63750100-6233-4a41-a18e-6fa5625a9fd0 req-680961cf-f09f-43b2-a6b5-46fda95083f1 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:14:25 np0005479823 nova_compute[235775]: 2025-10-10 10:14:25.958 2 DEBUG nova.compute.manager [req-63750100-6233-4a41-a18e-6fa5625a9fd0 req-680961cf-f09f-43b2-a6b5-46fda95083f1 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Processing event network-vif-plugged-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct 10 06:14:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:26.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:26 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:26.207 141795 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Oct 10 06:14:26 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:26.208 141795 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp5qd_lucz/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Oct 10 06:14:26 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:26.100 241439 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct 10 06:14:26 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:26.104 241439 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct 10 06:14:26 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:26.107 241439 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Oct 10 06:14:26 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:26.108 241439 INFO oslo.privsep.daemon [-] privsep daemon running as pid 241439#033[00m
Oct 10 06:14:26 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:26.212 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[1e2a4b94-7753-4efe-97a6-ef25e3e01843]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:26 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:14:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:26 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:14:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 10 06:14:26 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2445054076' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 06:14:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 10 06:14:26 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2445054076' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.492 2 DEBUG nova.compute.manager [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.493 2 DEBUG nova.virt.driver [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Emitting event <LifecycleEvent: 1760091266.4920683, f6ec6baf-a91e-4c7e-b1cf-b176d952068f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.493 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] VM Started (Lifecycle Event)#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.505 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.509 2 INFO nova.virt.libvirt.driver [-] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Instance spawned successfully.#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.509 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.513 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.516 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.526 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.526 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.527 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.527 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.527 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.528 2 DEBUG nova.virt.libvirt.driver [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.548 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.549 2 DEBUG nova.virt.driver [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Emitting event <LifecycleEvent: 1760091266.4930215, f6ec6baf-a91e-4c7e-b1cf-b176d952068f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.549 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] VM Paused (Lifecycle Event)#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.571 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.575 2 DEBUG nova.virt.driver [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Emitting event <LifecycleEvent: 1760091266.495544, f6ec6baf-a91e-4c7e-b1cf-b176d952068f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.575 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] VM Resumed (Lifecycle Event)#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.582 2 INFO nova.compute.manager [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Took 8.28 seconds to spawn the instance on the hypervisor.#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.583 2 DEBUG nova.compute.manager [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.593 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.598 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.621 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.655 2 INFO nova.compute.manager [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Took 9.49 seconds to build instance.#033[00m
Oct 10 06:14:26 np0005479823 nova_compute[235775]: 2025-10-10 10:14:26.677 2 DEBUG oslo_concurrency.lockutils [None req-3b82bcc5-8d06-434a-9def-ae70e6fcc604 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:14:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:26.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:26 np0005479823 podman[241447]: 2025-10-10 10:14:26.805078474 +0000 UTC m=+0.071484580 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct 10 06:14:26 np0005479823 podman[241449]: 2025-10-10 10:14:26.819227319 +0000 UTC m=+0.082743402 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 10 06:14:26 np0005479823 podman[241448]: 2025-10-10 10:14:26.822895357 +0000 UTC m=+0.089612983 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible)
Oct 10 06:14:26 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:26.954 241439 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:14:26 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:26.954 241439 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:14:26 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:26.954 241439 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:14:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:27 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:27 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e000a7e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:27 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:27 np0005479823 nova_compute[235775]: 2025-10-10 10:14:27.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:27 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:27.866 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[0d0b7062-ae11-445e-8982-237196222505]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:27 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:27.868 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc8850c4c-d1 in ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct 10 06:14:27 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:27.870 241439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc8850c4c-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct 10 06:14:27 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:27.870 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[7e87bbae-570d-40df-b30b-f7ea5e4116f3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:27 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:27.875 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[3749dbc3-30dc-46a6-ad15-97ebc956108a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:27 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:27.907 141908 DEBUG oslo.privsep.daemon [-] privsep: reply[1e4f8f2d-faa1-499f-a3b0-194e05985770]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:27 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:27.941 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[323322d2-c8a5-4cc0-9b99-5bd605869e85]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:27 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:27.944 141795 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp9fmkvg9o/privsep.sock']#033[00m
Oct 10 06:14:28 np0005479823 nova_compute[235775]: 2025-10-10 10:14:28.042 2 DEBUG nova.compute.manager [req-5797225d-ff67-490b-8725-0715b40b8676 req-52b2ff69-b60a-4e78-92fb-306415ec3933 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Received event network-vif-plugged-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct 10 06:14:28 np0005479823 nova_compute[235775]: 2025-10-10 10:14:28.043 2 DEBUG oslo_concurrency.lockutils [req-5797225d-ff67-490b-8725-0715b40b8676 req-52b2ff69-b60a-4e78-92fb-306415ec3933 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:14:28 np0005479823 nova_compute[235775]: 2025-10-10 10:14:28.043 2 DEBUG oslo_concurrency.lockutils [req-5797225d-ff67-490b-8725-0715b40b8676 req-52b2ff69-b60a-4e78-92fb-306415ec3933 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:14:28 np0005479823 nova_compute[235775]: 2025-10-10 10:14:28.044 2 DEBUG oslo_concurrency.lockutils [req-5797225d-ff67-490b-8725-0715b40b8676 req-52b2ff69-b60a-4e78-92fb-306415ec3933 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:14:28 np0005479823 nova_compute[235775]: 2025-10-10 10:14:28.044 2 DEBUG nova.compute.manager [req-5797225d-ff67-490b-8725-0715b40b8676 req-52b2ff69-b60a-4e78-92fb-306415ec3933 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] No waiting events found dispatching network-vif-plugged-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct 10 06:14:28 np0005479823 nova_compute[235775]: 2025-10-10 10:14:28.044 2 WARNING nova.compute.manager [req-5797225d-ff67-490b-8725-0715b40b8676 req-52b2ff69-b60a-4e78-92fb-306415ec3933 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Received unexpected event network-vif-plugged-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 for instance with vm_state active and task_state None.#033[00m
Oct 10 06:14:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:28.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:28 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:28.690 141795 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Oct 10 06:14:28 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:28.692 141795 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp9fmkvg9o/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Oct 10 06:14:28 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:28.581 241521 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct 10 06:14:28 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:28.587 241521 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct 10 06:14:28 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:28.590 241521 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Oct 10 06:14:28 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:28.591 241521 INFO oslo.privsep.daemon [-] privsep daemon running as pid 241521#033[00m
Oct 10 06:14:28 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:28.695 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[88a17650-70b2-478e-bdf7-587900d068e9]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:28.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:29 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:29.218 241521 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:14:29 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:29.218 241521 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:14:29 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:29.218 241521 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:14:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:29 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e000a7e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:29 np0005479823 nova_compute[235775]: 2025-10-10 10:14:29.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:29 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:29.783 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[301b1efe-413a-4940-bef6-789a925536df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:29 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:29.789 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[badf5fcc-fdde-45f3-8c8d-795605347709]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:29 np0005479823 NetworkManager[44866]: <info>  [1760091269.7906] manager: (tapc8850c4c-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/27)
Oct 10 06:14:29 np0005479823 systemd-udevd[241533]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 06:14:29 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:29.822 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[a9d98cc5-4ccd-445e-a92e-4653bdd24065]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:29 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:29.827 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[f557582d-34f1-4ce5-890f-41c43154f1a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:29 np0005479823 NetworkManager[44866]: <info>  [1760091269.8580] device (tapc8850c4c-d0): carrier: link connected
Oct 10 06:14:29 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:29.863 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[1c3e4673-1b12-47bb-bbc8-3347ec1d7632]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:29 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:29.887 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[62343da3-4007-4ed7-b09b-efb64de195d3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc8850c4c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2c:14:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417341, 'reachable_time': 35993, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241552, 'error': None, 'target': 'ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:29 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:29.906 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[060d29f0-e432-4ef2-b0a3-44caf5ead820]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2c:1444'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 417341, 'tstamp': 417341}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241553, 'error': None, 'target': 'ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:29 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:29.924 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[3035d08a-b26c-41f3-b54d-6532e02e2962]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc8850c4c-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2c:14:44'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417341, 'reachable_time': 35993, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241554, 'error': None, 'target': 'ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:29 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:29.956 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[165bddf4-2694-4ebd-a507-81bfdb258528]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:30.007 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[488118f4-9e0a-4e78-81f8-a9493fc87197]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:30.010 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8850c4c-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:30.010 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:30.011 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc8850c4c-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:14:30 np0005479823 NetworkManager[44866]: <info>  [1760091270.0143] manager: (tapc8850c4c-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Oct 10 06:14:30 np0005479823 kernel: tapc8850c4c-d0: entered promiscuous mode
Oct 10 06:14:30 np0005479823 nova_compute[235775]: 2025-10-10 10:14:30.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:30 np0005479823 nova_compute[235775]: 2025-10-10 10:14:30.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:30.018 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc8850c4c-d0, col_values=(('external_ids', {'iface-id': '185907ee-d118-486d-93ad-c5a1b6a3a149'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:14:30 np0005479823 nova_compute[235775]: 2025-10-10 10:14:30.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:30 np0005479823 ovn_controller[132503]: 2025-10-10T10:14:30Z|00031|binding|INFO|Releasing lport 185907ee-d118-486d-93ad-c5a1b6a3a149 from this chassis (sb_readonly=0)
Oct 10 06:14:30 np0005479823 nova_compute[235775]: 2025-10-10 10:14:30.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:30.049 141795 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c8850c4c-dc38-4440-9c03-f2dd59684fe6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c8850c4c-dc38-4440-9c03-f2dd59684fe6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:30.050 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[e4cfeec1-5c41-4aa2-8c52-3a77df4c3853]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:30.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:30.052 141795 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]: global
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]:    log         /dev/log local0 debug
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]:    log-tag     haproxy-metadata-proxy-c8850c4c-dc38-4440-9c03-f2dd59684fe6
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]:    user        root
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]:    group       root
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]:    maxconn     1024
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]:    pidfile     /var/lib/neutron/external/pids/c8850c4c-dc38-4440-9c03-f2dd59684fe6.pid.haproxy
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]:    daemon
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]: 
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]: defaults
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]:    log global
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]:    mode http
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]:    option httplog
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]:    option dontlognull
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]:    option http-server-close
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]:    option forwardfor
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]:    retries                 3
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]:    timeout http-request    30s
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]:    timeout connect         30s
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]:    timeout client          32s
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]:    timeout server          32s
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]:    timeout http-keep-alive 30s
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]: 
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]: 
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]: listen listener
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]:    bind 169.254.169.254:80
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]:    server metadata /var/lib/neutron/metadata_proxy
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]:    http-request add-header X-OVN-Network-ID c8850c4c-dc38-4440-9c03-f2dd59684fe6
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct 10 06:14:30 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:30.052 141795 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6', 'env', 'PROCESS_TAG=haproxy-c8850c4c-dc38-4440-9c03-f2dd59684fe6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c8850c4c-dc38-4440-9c03-f2dd59684fe6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct 10 06:14:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:14:30 np0005479823 podman[241587]: 2025-10-10 10:14:30.420203979 +0000 UTC m=+0.054687620 container create 0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Oct 10 06:14:30 np0005479823 systemd[1]: Started libpod-conmon-0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9.scope.
Oct 10 06:14:30 np0005479823 podman[241587]: 2025-10-10 10:14:30.389720729 +0000 UTC m=+0.024204390 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 10 06:14:30 np0005479823 systemd[1]: Started libcrun container.
Oct 10 06:14:30 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62fcae525be285a1d8adf5d06c7c663fa56b70679788d48c992ce41c622e09da/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 10 06:14:30 np0005479823 podman[241587]: 2025-10-10 10:14:30.514960896 +0000 UTC m=+0.149444567 container init 0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct 10 06:14:30 np0005479823 podman[241587]: 2025-10-10 10:14:30.524128462 +0000 UTC m=+0.158612113 container start 0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct 10 06:14:30 np0005479823 neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6[241603]: [NOTICE]   (241607) : New worker (241609) forked
Oct 10 06:14:30 np0005479823 neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6[241603]: [NOTICE]   (241607) : Loading success.
Oct 10 06:14:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:30.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:31 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b0003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:31 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0d4002520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:31 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0b8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Oct 10 06:14:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:32.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:32 np0005479823 nova_compute[235775]: 2025-10-10 10:14:32.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:14:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:32.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:14:32 np0005479823 ovn_controller[132503]: 2025-10-10T10:14:32Z|00032|binding|INFO|Releasing lport 185907ee-d118-486d-93ad-c5a1b6a3a149 from this chassis (sb_readonly=0)
Oct 10 06:14:32 np0005479823 NetworkManager[44866]: <info>  [1760091272.9525] manager: (patch-br-int-to-provnet-1d90fa58-74cb-4ad4-84e0-739689a69111): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/29)
Oct 10 06:14:32 np0005479823 nova_compute[235775]: 2025-10-10 10:14:32.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:32 np0005479823 NetworkManager[44866]: <info>  [1760091272.9532] device (patch-br-int-to-provnet-1d90fa58-74cb-4ad4-84e0-739689a69111)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 06:14:32 np0005479823 NetworkManager[44866]: <info>  [1760091272.9541] manager: (patch-provnet-1d90fa58-74cb-4ad4-84e0-739689a69111-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/30)
Oct 10 06:14:32 np0005479823 NetworkManager[44866]: <info>  [1760091272.9544] device (patch-provnet-1d90fa58-74cb-4ad4-84e0-739689a69111-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 10 06:14:32 np0005479823 NetworkManager[44866]: <info>  [1760091272.9551] manager: (patch-br-int-to-provnet-1d90fa58-74cb-4ad4-84e0-739689a69111): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Oct 10 06:14:32 np0005479823 NetworkManager[44866]: <info>  [1760091272.9555] manager: (patch-provnet-1d90fa58-74cb-4ad4-84e0-739689a69111-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Oct 10 06:14:32 np0005479823 NetworkManager[44866]: <info>  [1760091272.9558] device (patch-br-int-to-provnet-1d90fa58-74cb-4ad4-84e0-739689a69111)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct 10 06:14:32 np0005479823 NetworkManager[44866]: <info>  [1760091272.9564] device (patch-provnet-1d90fa58-74cb-4ad4-84e0-739689a69111-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct 10 06:14:32 np0005479823 ovn_controller[132503]: 2025-10-10T10:14:32Z|00033|binding|INFO|Releasing lport 185907ee-d118-486d-93ad-c5a1b6a3a149 from this chassis (sb_readonly=0)
Oct 10 06:14:32 np0005479823 nova_compute[235775]: 2025-10-10 10:14:32.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:32 np0005479823 nova_compute[235775]: 2025-10-10 10:14:32.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[240464]: 10/10/2025 10:14:33 : epoch 68e8dc3c : compute-2 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd0e000a7e0 fd 39 proxy ignored for local
Oct 10 06:14:33 np0005479823 kernel: ganesha.nfsd[240938]: segfault at 50 ip 00007fd18c6b432e sp 00007fd145ffa210 error 4 in libntirpc.so.5.8[7fd18c699000+2c000] likely on CPU 3 (core 0, socket 3)
Oct 10 06:14:33 np0005479823 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Oct 10 06:14:33 np0005479823 systemd[1]: Started Process Core Dump (PID 241626/UID 0).
Oct 10 06:14:33 np0005479823 nova_compute[235775]: 2025-10-10 10:14:33.228 2 DEBUG nova.compute.manager [req-19ce4933-f38f-408c-9357-3a253a809833 req-de768135-8280-4d04-86f1-d35a9913b086 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Received event network-changed-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct 10 06:14:33 np0005479823 nova_compute[235775]: 2025-10-10 10:14:33.229 2 DEBUG nova.compute.manager [req-19ce4933-f38f-408c-9357-3a253a809833 req-de768135-8280-4d04-86f1-d35a9913b086 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Refreshing instance network info cache due to event network-changed-be812d6f-78ad-4f90-9cd0-0ae2444e7f71. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct 10 06:14:33 np0005479823 nova_compute[235775]: 2025-10-10 10:14:33.229 2 DEBUG oslo_concurrency.lockutils [req-19ce4933-f38f-408c-9357-3a253a809833 req-de768135-8280-4d04-86f1-d35a9913b086 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "refresh_cache-f6ec6baf-a91e-4c7e-b1cf-b176d952068f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct 10 06:14:33 np0005479823 nova_compute[235775]: 2025-10-10 10:14:33.229 2 DEBUG oslo_concurrency.lockutils [req-19ce4933-f38f-408c-9357-3a253a809833 req-de768135-8280-4d04-86f1-d35a9913b086 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquired lock "refresh_cache-f6ec6baf-a91e-4c7e-b1cf-b176d952068f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct 10 06:14:33 np0005479823 nova_compute[235775]: 2025-10-10 10:14:33.230 2 DEBUG nova.network.neutron [req-19ce4933-f38f-408c-9357-3a253a809833 req-de768135-8280-4d04-86f1-d35a9913b086 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Refreshing network info cache for port be812d6f-78ad-4f90-9cd0-0ae2444e7f71 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct 10 06:14:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:34.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:34 np0005479823 nova_compute[235775]: 2025-10-10 10:14:34.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:34 np0005479823 systemd-coredump[241627]: Process 240468 (ganesha.nfsd) of user 0 dumped core.#012#012Stack trace of thread 58:#012#0  0x00007fd18c6b432e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)#012ELF object binary architecture: AMD x86-64
Oct 10 06:14:34 np0005479823 nova_compute[235775]: 2025-10-10 10:14:34.671 2 DEBUG nova.network.neutron [req-19ce4933-f38f-408c-9357-3a253a809833 req-de768135-8280-4d04-86f1-d35a9913b086 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Updated VIF entry in instance network info cache for port be812d6f-78ad-4f90-9cd0-0ae2444e7f71. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct 10 06:14:34 np0005479823 nova_compute[235775]: 2025-10-10 10:14:34.671 2 DEBUG nova.network.neutron [req-19ce4933-f38f-408c-9357-3a253a809833 req-de768135-8280-4d04-86f1-d35a9913b086 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Updating instance_info_cache with network_info: [{"id": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "address": "fa:16:3e:35:91:37", "network": {"id": "c8850c4c-dc38-4440-9c03-f2dd59684fe6", "bridge": "br-int", "label": "tempest-network-smoke--781461532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe812d6f-78", "ovs_interfaceid": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct 10 06:14:34 np0005479823 nova_compute[235775]: 2025-10-10 10:14:34.691 2 DEBUG oslo_concurrency.lockutils [req-19ce4933-f38f-408c-9357-3a253a809833 req-de768135-8280-4d04-86f1-d35a9913b086 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Releasing lock "refresh_cache-f6ec6baf-a91e-4c7e-b1cf-b176d952068f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct 10 06:14:34 np0005479823 systemd[1]: systemd-coredump@11-241626-0.service: Deactivated successfully.
Oct 10 06:14:34 np0005479823 systemd[1]: systemd-coredump@11-241626-0.service: Consumed 1.474s CPU time.
Oct 10 06:14:34 np0005479823 podman[241684]: 2025-10-10 10:14:34.761434837 +0000 UTC m=+0.029067985 container died 846cd6823111ee42a70b41700b1a43ae41b27e9f805d56155411b3444ac3e4da (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1)
Oct 10 06:14:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:34 np0005479823 systemd[1]: var-lib-containers-storage-overlay-8f685c822357fb25a63d78c0de3edff79157420b24cde6f68449c7f664af3204-merged.mount: Deactivated successfully.
Oct 10 06:14:34 np0005479823 podman[241684]: 2025-10-10 10:14:34.809365579 +0000 UTC m=+0.076998697 container remove 846cd6823111ee42a70b41700b1a43ae41b27e9f805d56155411b3444ac3e4da (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 06:14:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:34.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:34 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Main process exited, code=exited, status=139/n/a
Oct 10 06:14:34 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Failed with result 'exit-code'.
Oct 10 06:14:34 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.298s CPU time.
Oct 10 06:14:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:14:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:35 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:14:35 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:14:35 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:14:35 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:14:35 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:14:35 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:14:35 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:14:35 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:14:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:36.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:36.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:37 np0005479823 nova_compute[235775]: 2025-10-10 10:14:37.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:38.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:38 np0005479823 ovn_controller[132503]: 2025-10-10T10:14:38Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:35:91:37 10.100.0.11
Oct 10 06:14:38 np0005479823 ovn_controller[132503]: 2025-10-10T10:14:38Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:35:91:37 10.100.0.11
Oct 10 06:14:38 np0005479823 podman[241789]: 2025-10-10 10:14:38.632515956 +0000 UTC m=+0.048160240 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true)
Oct 10 06:14:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:14:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:38.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:14:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [WARNING] 282/101439 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Oct 10 06:14:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [NOTICE] 282/101439 (4) : haproxy version is 2.3.17-d1c9119
Oct 10 06:14:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [NOTICE] 282/101439 (4) : path to executable is /usr/local/sbin/haproxy
Oct 10 06:14:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol[85474]: [ALERT] 282/101439 (4) : backend 'backend' has no server available!
Oct 10 06:14:39 np0005479823 nova_compute[235775]: 2025-10-10 10:14:39.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:40.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:14:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:14:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:40.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:14:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:41.467 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:14:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:41.468 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:14:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:41.468 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:14:41 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:14:41 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:14:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:42.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:42 np0005479823 nova_compute[235775]: 2025-10-10 10:14:42.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:14:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:42.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:14:43 np0005479823 nova_compute[235775]: 2025-10-10 10:14:43.719 2 INFO nova.compute.manager [None req-b91e469f-aff6-42b0-9240-485040d841ba 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Get console output#033[00m
Oct 10 06:14:43 np0005479823 nova_compute[235775]: 2025-10-10 10:14:43.725 2 INFO oslo.privsep.daemon [None req-b91e469f-aff6-42b0-9240-485040d841ba 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp3bylatxb/privsep.sock']#033[00m
Oct 10 06:14:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:14:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:44.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:14:44 np0005479823 nova_compute[235775]: 2025-10-10 10:14:44.372 2 INFO oslo.privsep.daemon [None req-b91e469f-aff6-42b0-9240-485040d841ba 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Oct 10 06:14:44 np0005479823 nova_compute[235775]: 2025-10-10 10:14:44.254 763 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct 10 06:14:44 np0005479823 nova_compute[235775]: 2025-10-10 10:14:44.258 763 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct 10 06:14:44 np0005479823 nova_compute[235775]: 2025-10-10 10:14:44.260 763 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Oct 10 06:14:44 np0005479823 nova_compute[235775]: 2025-10-10 10:14:44.260 763 INFO oslo.privsep.daemon [-] privsep daemon running as pid 763#033[00m
Oct 10 06:14:44 np0005479823 nova_compute[235775]: 2025-10-10 10:14:44.457 763 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct 10 06:14:44 np0005479823 nova_compute[235775]: 2025-10-10 10:14:44.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:44 np0005479823 nova_compute[235775]: 2025-10-10 10:14:44.808 2 DEBUG oslo_concurrency.lockutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:14:44 np0005479823 nova_compute[235775]: 2025-10-10 10:14:44.809 2 DEBUG oslo_concurrency.lockutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:14:44 np0005479823 nova_compute[235775]: 2025-10-10 10:14:44.809 2 DEBUG oslo_concurrency.lockutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:14:44 np0005479823 nova_compute[235775]: 2025-10-10 10:14:44.810 2 DEBUG oslo_concurrency.lockutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:14:44 np0005479823 nova_compute[235775]: 2025-10-10 10:14:44.810 2 DEBUG oslo_concurrency.lockutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:14:44 np0005479823 nova_compute[235775]: 2025-10-10 10:14:44.812 2 INFO nova.compute.manager [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Terminating instance#033[00m
Oct 10 06:14:44 np0005479823 nova_compute[235775]: 2025-10-10 10:14:44.813 2 DEBUG nova.compute.manager [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct 10 06:14:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:44.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:44 np0005479823 kernel: tapbe812d6f-78 (unregistering): left promiscuous mode
Oct 10 06:14:44 np0005479823 NetworkManager[44866]: <info>  [1760091284.8621] device (tapbe812d6f-78): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 10 06:14:44 np0005479823 ovn_controller[132503]: 2025-10-10T10:14:44Z|00034|binding|INFO|Releasing lport be812d6f-78ad-4f90-9cd0-0ae2444e7f71 from this chassis (sb_readonly=0)
Oct 10 06:14:44 np0005479823 ovn_controller[132503]: 2025-10-10T10:14:44Z|00035|binding|INFO|Setting lport be812d6f-78ad-4f90-9cd0-0ae2444e7f71 down in Southbound
Oct 10 06:14:44 np0005479823 nova_compute[235775]: 2025-10-10 10:14:44.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:44 np0005479823 ovn_controller[132503]: 2025-10-10T10:14:44Z|00036|binding|INFO|Removing iface tapbe812d6f-78 ovn-installed in OVS
Oct 10 06:14:44 np0005479823 nova_compute[235775]: 2025-10-10 10:14:44.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:44 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:44.879 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:91:37 10.100.0.11'], port_security=['fa:16:3e:35:91:37 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'f6ec6baf-a91e-4c7e-b1cf-b176d952068f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8850c4c-dc38-4440-9c03-f2dd59684fe6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5e531d4b440422d946eaf6fd4e166f7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2502283d-b38d-456e-8e7f-133a87baf32b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.181'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=21e2152f-e965-46e3-9774-988f8fdf189b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>], logical_port=be812d6f-78ad-4f90-9cd0-0ae2444e7f71) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct 10 06:14:44 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:44.880 141795 INFO neutron.agent.ovn.metadata.agent [-] Port be812d6f-78ad-4f90-9cd0-0ae2444e7f71 in datapath c8850c4c-dc38-4440-9c03-f2dd59684fe6 unbound from our chassis#033[00m
Oct 10 06:14:44 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:44.881 141795 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c8850c4c-dc38-4440-9c03-f2dd59684fe6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct 10 06:14:44 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:44.882 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[9211e87c-afc0-49c3-bb2c-e1e0a7b3dd81]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:44 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:44.882 141795 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6 namespace which is not needed anymore#033[00m
Oct 10 06:14:44 np0005479823 nova_compute[235775]: 2025-10-10 10:14:44.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:44 np0005479823 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000005.scope: Deactivated successfully.
Oct 10 06:14:44 np0005479823 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000005.scope: Consumed 12.673s CPU time.
Oct 10 06:14:44 np0005479823 systemd-machined[192768]: Machine qemu-1-instance-00000005 terminated.
Oct 10 06:14:45 np0005479823 neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6[241603]: [NOTICE]   (241607) : haproxy version is 2.8.14-c23fe91
Oct 10 06:14:45 np0005479823 neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6[241603]: [NOTICE]   (241607) : path to executable is /usr/sbin/haproxy
Oct 10 06:14:45 np0005479823 neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6[241603]: [WARNING]  (241607) : Exiting Master process...
Oct 10 06:14:45 np0005479823 neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6[241603]: [WARNING]  (241607) : Exiting Master process...
Oct 10 06:14:45 np0005479823 neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6[241603]: [ALERT]    (241607) : Current worker (241609) exited with code 143 (Terminated)
Oct 10 06:14:45 np0005479823 neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6[241603]: [WARNING]  (241607) : All workers exited. Exiting... (0)
Oct 10 06:14:45 np0005479823 systemd[1]: libpod-0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9.scope: Deactivated successfully.
Oct 10 06:14:45 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Scheduled restart job, restart counter is at 12.
Oct 10 06:14:45 np0005479823 systemd[1]: Stopped Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 06:14:45 np0005479823 systemd[1]: ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4@nfs.cephfs.1.0.compute-2.boccfy.service: Consumed 1.298s CPU time.
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.057 2 INFO nova.virt.libvirt.driver [-] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Instance destroyed successfully.#033[00m
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.058 2 DEBUG nova.objects.instance [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lazy-loading 'resources' on Instance uuid f6ec6baf-a91e-4c7e-b1cf-b176d952068f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct 10 06:14:45 np0005479823 podman[241873]: 2025-10-10 10:14:45.060252725 +0000 UTC m=+0.055451636 container died 0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct 10 06:14:45 np0005479823 systemd[1]: Starting Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4...
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.071 2 DEBUG nova.virt.libvirt.vif [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-10T10:14:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1362038391',display_name='tempest-TestNetworkBasicOps-server-1362038391',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1362038391',id=5,image_ref='5ae78700-970d-45b4-a57d-978a054c7519',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOaO/Dm5TZJdJA+p0WorpE1s/wHDKiboiIskSllf2vhdjUj1oz81caVPGQVtZrwI+VVMAczLEmtRNwhb15+QK4so2BghvGEI3ChmYsvOZuU3tzU+nN+IQyotPE2q48Vw5A==',key_name='tempest-TestNetworkBasicOps-804562104',keypairs=<?>,launch_index=0,launched_at=2025-10-10T10:14:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d5e531d4b440422d946eaf6fd4e166f7',ramdisk_id='',reservation_id='r-ksfjfy6b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5ae78700-970d-45b4-a57d-978a054c7519',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-188749107',owner_user_name='tempest-TestNetworkBasicOps-188749107-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-10T10:14:26Z,user_data=None,user_id='7956778c03764aaf8906c9b435337976',uuid=f6ec6baf-a91e-4c7e-b1cf-b176d952068f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "address": "fa:16:3e:35:91:37", "network": {"id": "c8850c4c-dc38-4440-9c03-f2dd59684fe6", "bridge": "br-int", "label": "tempest-network-smoke--781461532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe812d6f-78", "ovs_interfaceid": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.071 2 DEBUG nova.network.os_vif_util [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converting VIF {"id": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "address": "fa:16:3e:35:91:37", "network": {"id": "c8850c4c-dc38-4440-9c03-f2dd59684fe6", "bridge": "br-int", "label": "tempest-network-smoke--781461532", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe812d6f-78", "ovs_interfaceid": "be812d6f-78ad-4f90-9cd0-0ae2444e7f71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.072 2 DEBUG nova.network.os_vif_util [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:35:91:37,bridge_name='br-int',has_traffic_filtering=True,id=be812d6f-78ad-4f90-9cd0-0ae2444e7f71,network=Network(c8850c4c-dc38-4440-9c03-f2dd59684fe6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe812d6f-78') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.072 2 DEBUG os_vif [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:35:91:37,bridge_name='br-int',has_traffic_filtering=True,id=be812d6f-78ad-4f90-9cd0-0ae2444e7f71,network=Network(c8850c4c-dc38-4440-9c03-f2dd59684fe6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe812d6f-78') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.074 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbe812d6f-78, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.081 2 INFO os_vif [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:35:91:37,bridge_name='br-int',has_traffic_filtering=True,id=be812d6f-78ad-4f90-9cd0-0ae2444e7f71,network=Network(c8850c4c-dc38-4440-9c03-f2dd59684fe6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe812d6f-78')#033[00m
Oct 10 06:14:45 np0005479823 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9-userdata-shm.mount: Deactivated successfully.
Oct 10 06:14:45 np0005479823 systemd[1]: var-lib-containers-storage-overlay-62fcae525be285a1d8adf5d06c7c663fa56b70679788d48c992ce41c622e09da-merged.mount: Deactivated successfully.
Oct 10 06:14:45 np0005479823 podman[241873]: 2025-10-10 10:14:45.109682424 +0000 UTC m=+0.104881315 container cleanup 0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 06:14:45 np0005479823 systemd[1]: libpod-conmon-0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9.scope: Deactivated successfully.
Oct 10 06:14:45 np0005479823 podman[241937]: 2025-10-10 10:14:45.181537625 +0000 UTC m=+0.046614700 container remove 0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 10 06:14:45 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:45.189 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[b69bca29-3bc5-485e-be49-ff10788a70fe]: (4, ('Fri Oct 10 10:14:44 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6 (0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9)\n0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9\nFri Oct 10 10:14:45 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6 (0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9)\n0c2879093a6023694f336748c0941d6a4da9f099ee7995a65deaaba60f8a76d9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:45 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:45.191 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[f1d2c09b-3238-4834-b48c-0ab3c002ca37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:45 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:45.191 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8850c4c-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:45 np0005479823 kernel: tapc8850c4c-d0: left promiscuous mode
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:45 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:45.214 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[5b2b9461-327c-444c-89e0-b5a18a42039c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:45 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:45.242 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[07148111-9e27-4dfd-8eb4-19d999636c37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:45 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:45.243 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[4109441a-534a-4cf8-bc7a-2b6fa38dbf47]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:45 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:45.259 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[783c3487-faeb-4e98-822e-3c2469bdaea9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417333, 'reachable_time': 39281, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241982, 'error': None, 'target': 'ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:45 np0005479823 systemd[1]: run-netns-ovnmeta\x2dc8850c4c\x2ddc38\x2d4440\x2d9c03\x2df2dd59684fe6.mount: Deactivated successfully.
Oct 10 06:14:45 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:45.274 141908 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c8850c4c-dc38-4440-9c03-f2dd59684fe6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct 10 06:14:45 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:45.275 141908 DEBUG oslo.privsep.daemon [-] privsep: reply[237005b9-421f-4741-b0b4-5492ab726e3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:14:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:14:45 np0005479823 podman[241989]: 2025-10-10 10:14:45.366574337 +0000 UTC m=+0.065863319 container create eac346131ad153d129d5755e1377a2007627c03598a265a99b9e06d18355c13f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 06:14:45 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb5c007710ca0236f76704862fb485066d7094fed5c8c0496d6985ebf3d17e39/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Oct 10 06:14:45 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb5c007710ca0236f76704862fb485066d7094fed5c8c0496d6985ebf3d17e39/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 06:14:45 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb5c007710ca0236f76704862fb485066d7094fed5c8c0496d6985ebf3d17e39/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 06:14:45 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb5c007710ca0236f76704862fb485066d7094fed5c8c0496d6985ebf3d17e39/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.1.0.compute-2.boccfy-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Oct 10 06:14:45 np0005479823 podman[241989]: 2025-10-10 10:14:45.342291726 +0000 UTC m=+0.041580738 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 06:14:45 np0005479823 podman[241989]: 2025-10-10 10:14:45.436961731 +0000 UTC m=+0.136250733 container init eac346131ad153d129d5755e1377a2007627c03598a265a99b9e06d18355c13f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.license=GPLv2)
Oct 10 06:14:45 np0005479823 podman[241989]: 2025-10-10 10:14:45.444051349 +0000 UTC m=+0.143340331 container start eac346131ad153d129d5755e1377a2007627c03598a265a99b9e06d18355c13f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 06:14:45 np0005479823 bash[241989]: eac346131ad153d129d5755e1377a2007627c03598a265a99b9e06d18355c13f
Oct 10 06:14:45 np0005479823 systemd[1]: Started Ceph nfs.cephfs.1.0.compute-2.boccfy for 21f084a3-af34-5230-afe4-ea5cd24a55f4.
Oct 10 06:14:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Oct 10 06:14:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.500 2 INFO nova.virt.libvirt.driver [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Deleting instance files /var/lib/nova/instances/f6ec6baf-a91e-4c7e-b1cf-b176d952068f_del
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.501 2 INFO nova.virt.libvirt.driver [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Deletion of /var/lib/nova/instances/f6ec6baf-a91e-4c7e-b1cf-b176d952068f_del complete
Oct 10 06:14:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Oct 10 06:14:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Oct 10 06:14:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Oct 10 06:14:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Oct 10 06:14:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Oct 10 06:14:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.574 2 DEBUG nova.virt.libvirt.host [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.574 2 INFO nova.virt.libvirt.host [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] UEFI support detected
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.576 2 INFO nova.compute.manager [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Took 0.76 seconds to destroy the instance on the hypervisor.
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.577 2 DEBUG oslo.service.loopingcall [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.577 2 DEBUG nova.compute.manager [-] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.577 2 DEBUG nova.network.neutron [-] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.619 2 DEBUG nova.compute.manager [req-32145f94-bd9a-406b-bec0-2635922ef7c8 req-9ea60e83-9c01-4e4d-9fdb-f0d421d02cca 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Received event network-vif-unplugged-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.619 2 DEBUG oslo_concurrency.lockutils [req-32145f94-bd9a-406b-bec0-2635922ef7c8 req-9ea60e83-9c01-4e4d-9fdb-f0d421d02cca 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.619 2 DEBUG oslo_concurrency.lockutils [req-32145f94-bd9a-406b-bec0-2635922ef7c8 req-9ea60e83-9c01-4e4d-9fdb-f0d421d02cca 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.620 2 DEBUG oslo_concurrency.lockutils [req-32145f94-bd9a-406b-bec0-2635922ef7c8 req-9ea60e83-9c01-4e4d-9fdb-f0d421d02cca 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.620 2 DEBUG nova.compute.manager [req-32145f94-bd9a-406b-bec0-2635922ef7c8 req-9ea60e83-9c01-4e4d-9fdb-f0d421d02cca 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] No waiting events found dispatching network-vif-unplugged-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.620 2 DEBUG nova.compute.manager [req-32145f94-bd9a-406b-bec0-2635922ef7c8 req-9ea60e83-9c01-4e4d-9fdb-f0d421d02cca 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Received event network-vif-unplugged-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct 10 06:14:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:45 np0005479823 nova_compute[235775]: 2025-10-10 10:14:45.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:14:45 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:45.776 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 10 06:14:45 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:45.778 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 10 06:14:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:46.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:14:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:46.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.065011) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091287065056, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2358, "num_deletes": 251, "total_data_size": 6164593, "memory_usage": 6261456, "flush_reason": "Manual Compaction"}
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091287085962, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 3994523, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 26113, "largest_seqno": 28466, "table_properties": {"data_size": 3985243, "index_size": 5774, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19477, "raw_average_key_size": 20, "raw_value_size": 3966615, "raw_average_value_size": 4119, "num_data_blocks": 254, "num_entries": 963, "num_filter_entries": 963, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760091080, "oldest_key_time": 1760091080, "file_creation_time": 1760091287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 21039 microseconds, and 8398 cpu microseconds.
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.086045) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 3994523 bytes OK
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.086078) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.087636) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.087662) EVENT_LOG_v1 {"time_micros": 1760091287087654, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.087694) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 6154232, prev total WAL file size 6154232, number of live WAL files 2.
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.089441) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032303038' seq:72057594037927935, type:22 .. '7061786F730032323630' seq:0, type:0; will stop at (end)
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(3900KB)], [51(11MB)]
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091287089496, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 16447771, "oldest_snapshot_seqno": -1}
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 5813 keys, 14325919 bytes, temperature: kUnknown
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091287167539, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 14325919, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14286658, "index_size": 23599, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14597, "raw_key_size": 147737, "raw_average_key_size": 25, "raw_value_size": 14181276, "raw_average_value_size": 2439, "num_data_blocks": 964, "num_entries": 5813, "num_filter_entries": 5813, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760091287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.167820) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 14325919 bytes
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.169189) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 210.5 rd, 183.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.8, 11.9 +0.0 blob) out(13.7 +0.0 blob), read-write-amplify(7.7) write-amplify(3.6) OK, records in: 6331, records dropped: 518 output_compression: NoCompression
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.169215) EVENT_LOG_v1 {"time_micros": 1760091287169203, "job": 30, "event": "compaction_finished", "compaction_time_micros": 78119, "compaction_time_cpu_micros": 25940, "output_level": 6, "num_output_files": 1, "total_output_size": 14325919, "num_input_records": 6331, "num_output_records": 5813, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091287170357, "job": 30, "event": "table_file_deletion", "file_number": 53}
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091287173358, "job": 30, "event": "table_file_deletion", "file_number": 51}
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.089355) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.173414) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.173418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.173420) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.173422) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:14:47 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:14:47.173424) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:14:47 np0005479823 nova_compute[235775]: 2025-10-10 10:14:47.587 2 DEBUG nova.network.neutron [-] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 10 06:14:47 np0005479823 nova_compute[235775]: 2025-10-10 10:14:47.623 2 INFO nova.compute.manager [-] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Took 2.05 seconds to deallocate network for instance.
Oct 10 06:14:47 np0005479823 nova_compute[235775]: 2025-10-10 10:14:47.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:14:47 np0005479823 nova_compute[235775]: 2025-10-10 10:14:47.695 2 DEBUG oslo_concurrency.lockutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 06:14:47 np0005479823 nova_compute[235775]: 2025-10-10 10:14:47.695 2 DEBUG oslo_concurrency.lockutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 06:14:47 np0005479823 nova_compute[235775]: 2025-10-10 10:14:47.739 2 DEBUG nova.compute.manager [req-221d6b6f-5307-4020-ad89-2d261fb34447 req-4fa740e5-22c1-4b55-9c0c-f99cd6524854 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Received event network-vif-plugged-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 06:14:47 np0005479823 nova_compute[235775]: 2025-10-10 10:14:47.739 2 DEBUG oslo_concurrency.lockutils [req-221d6b6f-5307-4020-ad89-2d261fb34447 req-4fa740e5-22c1-4b55-9c0c-f99cd6524854 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 06:14:47 np0005479823 nova_compute[235775]: 2025-10-10 10:14:47.740 2 DEBUG oslo_concurrency.lockutils [req-221d6b6f-5307-4020-ad89-2d261fb34447 req-4fa740e5-22c1-4b55-9c0c-f99cd6524854 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 06:14:47 np0005479823 nova_compute[235775]: 2025-10-10 10:14:47.740 2 DEBUG oslo_concurrency.lockutils [req-221d6b6f-5307-4020-ad89-2d261fb34447 req-4fa740e5-22c1-4b55-9c0c-f99cd6524854 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 06:14:47 np0005479823 nova_compute[235775]: 2025-10-10 10:14:47.740 2 DEBUG nova.compute.manager [req-221d6b6f-5307-4020-ad89-2d261fb34447 req-4fa740e5-22c1-4b55-9c0c-f99cd6524854 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] No waiting events found dispatching network-vif-plugged-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 10 06:14:47 np0005479823 nova_compute[235775]: 2025-10-10 10:14:47.740 2 WARNING nova.compute.manager [req-221d6b6f-5307-4020-ad89-2d261fb34447 req-4fa740e5-22c1-4b55-9c0c-f99cd6524854 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Received unexpected event network-vif-plugged-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 for instance with vm_state deleted and task_state None.
Oct 10 06:14:47 np0005479823 nova_compute[235775]: 2025-10-10 10:14:47.740 2 DEBUG nova.compute.manager [req-221d6b6f-5307-4020-ad89-2d261fb34447 req-4fa740e5-22c1-4b55-9c0c-f99cd6524854 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Received event network-vif-deleted-be812d6f-78ad-4f90-9cd0-0ae2444e7f71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 06:14:47 np0005479823 nova_compute[235775]: 2025-10-10 10:14:47.762 2 DEBUG oslo_concurrency.processutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 06:14:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:48.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:48 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:14:48 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1817080526' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:14:48 np0005479823 nova_compute[235775]: 2025-10-10 10:14:48.194 2 DEBUG oslo_concurrency.processutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 06:14:48 np0005479823 nova_compute[235775]: 2025-10-10 10:14:48.200 2 DEBUG nova.compute.provider_tree [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Updating inventory in ProviderTree for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 10 06:14:48 np0005479823 nova_compute[235775]: 2025-10-10 10:14:48.245 2 ERROR nova.scheduler.client.report [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [req-469b8aff-522c-4b7a-a079-dfcb7da00766] Failed to update inventory to [{'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID dcdfa54c-9f95-46da-9af1-da3e28d81cf0.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-469b8aff-522c-4b7a-a079-dfcb7da00766"}]}
Oct 10 06:14:48 np0005479823 nova_compute[235775]: 2025-10-10 10:14:48.270 2 DEBUG nova.scheduler.client.report [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Refreshing inventories for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 10 06:14:48 np0005479823 nova_compute[235775]: 2025-10-10 10:14:48.296 2 DEBUG nova.scheduler.client.report [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Updating ProviderTree inventory for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 10 06:14:48 np0005479823 nova_compute[235775]: 2025-10-10 10:14:48.297 2 DEBUG nova.compute.provider_tree [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Updating inventory in ProviderTree for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 10 06:14:48 np0005479823 nova_compute[235775]: 2025-10-10 10:14:48.314 2 DEBUG nova.scheduler.client.report [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Refreshing aggregate associations for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 10 06:14:48 np0005479823 nova_compute[235775]: 2025-10-10 10:14:48.349 2 DEBUG nova.scheduler.client.report [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Refreshing trait associations for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0, traits: HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_FMA3,HW_CPU_X86_SVM,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_CLMUL,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_BMI2,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct 10 06:14:48 np0005479823 nova_compute[235775]: 2025-10-10 10:14:48.397 2 DEBUG oslo_concurrency.processutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:14:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:48 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:14:48 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2226505712' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:14:48 np0005479823 nova_compute[235775]: 2025-10-10 10:14:48.822 2 DEBUG oslo_concurrency.processutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:14:48 np0005479823 nova_compute[235775]: 2025-10-10 10:14:48.826 2 DEBUG nova.compute.provider_tree [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Updating inventory in ProviderTree for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct 10 06:14:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:14:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:48.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:14:48 np0005479823 nova_compute[235775]: 2025-10-10 10:14:48.912 2 DEBUG nova.scheduler.client.report [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Updated inventory for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Oct 10 06:14:48 np0005479823 nova_compute[235775]: 2025-10-10 10:14:48.913 2 DEBUG nova.compute.provider_tree [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Updating resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Oct 10 06:14:48 np0005479823 nova_compute[235775]: 2025-10-10 10:14:48.913 2 DEBUG nova.compute.provider_tree [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Updating inventory in ProviderTree for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct 10 06:14:48 np0005479823 nova_compute[235775]: 2025-10-10 10:14:48.940 2 DEBUG oslo_concurrency.lockutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.245s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:14:48 np0005479823 nova_compute[235775]: 2025-10-10 10:14:48.966 2 INFO nova.scheduler.client.report [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Deleted allocations for instance f6ec6baf-a91e-4c7e-b1cf-b176d952068f#033[00m
Oct 10 06:14:49 np0005479823 nova_compute[235775]: 2025-10-10 10:14:49.035 2 DEBUG oslo_concurrency.lockutils [None req-fee111f3-21ef-420d-8ef5-3802cf5d21b3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "f6ec6baf-a91e-4c7e-b1cf-b176d952068f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:14:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:50.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:50 np0005479823 nova_compute[235775]: 2025-10-10 10:14:50.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:14:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:50 np0005479823 nova_compute[235775]: 2025-10-10 10:14:50.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:14:50 np0005479823 nova_compute[235775]: 2025-10-10 10:14:50.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:14:50 np0005479823 nova_compute[235775]: 2025-10-10 10:14:50.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 10 06:14:50 np0005479823 nova_compute[235775]: 2025-10-10 10:14:50.816 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 10 06:14:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:50.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:50 np0005479823 nova_compute[235775]: 2025-10-10 10:14:50.836 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct 10 06:14:50 np0005479823 nova_compute[235775]: 2025-10-10 10:14:50.836 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:14:50 np0005479823 nova_compute[235775]: 2025-10-10 10:14:50.837 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:14:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:51 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:14:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:51 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:14:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:51 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:14:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:14:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:52.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:14:52 np0005479823 nova_compute[235775]: 2025-10-10 10:14:52.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:52 np0005479823 nova_compute[235775]: 2025-10-10 10:14:52.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:14:52 np0005479823 nova_compute[235775]: 2025-10-10 10:14:52.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:14:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:52.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:52 np0005479823 nova_compute[235775]: 2025-10-10 10:14:52.843 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:14:52 np0005479823 nova_compute[235775]: 2025-10-10 10:14:52.843 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:14:52 np0005479823 nova_compute[235775]: 2025-10-10 10:14:52.844 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:14:52 np0005479823 nova_compute[235775]: 2025-10-10 10:14:52.844 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 10 06:14:52 np0005479823 nova_compute[235775]: 2025-10-10 10:14:52.844 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:14:53 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:14:53 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1407840113' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:14:53 np0005479823 nova_compute[235775]: 2025-10-10 10:14:53.268 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:14:53 np0005479823 nova_compute[235775]: 2025-10-10 10:14:53.406 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:14:53 np0005479823 nova_compute[235775]: 2025-10-10 10:14:53.407 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4880MB free_disk=59.94269943237305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 10 06:14:53 np0005479823 nova_compute[235775]: 2025-10-10 10:14:53.408 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:14:53 np0005479823 nova_compute[235775]: 2025-10-10 10:14:53.408 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:14:53 np0005479823 nova_compute[235775]: 2025-10-10 10:14:53.486 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 10 06:14:53 np0005479823 nova_compute[235775]: 2025-10-10 10:14:53.486 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 10 06:14:53 np0005479823 nova_compute[235775]: 2025-10-10 10:14:53.513 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:14:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:53 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:14:53.781 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:14:53 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:14:53 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1389124777' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:14:54 np0005479823 nova_compute[235775]: 2025-10-10 10:14:54.013 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:14:54 np0005479823 nova_compute[235775]: 2025-10-10 10:14:54.019 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 10 06:14:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:54.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:54 np0005479823 nova_compute[235775]: 2025-10-10 10:14:54.205 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 10 06:14:54 np0005479823 nova_compute[235775]: 2025-10-10 10:14:54.239 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct 10 06:14:54 np0005479823 nova_compute[235775]: 2025-10-10 10:14:54.239 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.831s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:14:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:54.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:55 np0005479823 nova_compute[235775]: 2025-10-10 10:14:55.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:55 np0005479823 nova_compute[235775]: 2025-10-10 10:14:55.237 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:14:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:14:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:55 np0005479823 nova_compute[235775]: 2025-10-10 10:14:55.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:14:55 np0005479823 nova_compute[235775]: 2025-10-10 10:14:55.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct 10 06:14:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:14:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:14:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:14:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:14:56 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:14:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:56.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:56 np0005479823 nova_compute[235775]: 2025-10-10 10:14:56.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:14:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:56.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:57 np0005479823 nova_compute[235775]: 2025-10-10 10:14:57.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:14:57 np0005479823 podman[242149]: 2025-10-10 10:14:57.780489031 +0000 UTC m=+0.059483914 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 10 06:14:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:57 np0005479823 podman[242151]: 2025-10-10 10:14:57.785452741 +0000 UTC m=+0.056360944 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, managed_by=edpm_ansible)
Oct 10 06:14:57 np0005479823 podman[242150]: 2025-10-10 10:14:57.805791215 +0000 UTC m=+0.081168922 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 10 06:14:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:14:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:14:58.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:14:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:14:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:14:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:14:58.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:14:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:14:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:14:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:14:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:00 np0005479823 nova_compute[235775]: 2025-10-10 10:15:00.057 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1760091285.055142, f6ec6baf-a91e-4c7e-b1cf-b176d952068f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct 10 06:15:00 np0005479823 nova_compute[235775]: 2025-10-10 10:15:00.057 2 INFO nova.compute.manager [-] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] VM Stopped (Lifecycle Event)#033[00m
Oct 10 06:15:00 np0005479823 nova_compute[235775]: 2025-10-10 10:15:00.085 2 DEBUG nova.compute.manager [None req-5d03117d-d595-4d75-bcd1-0a18ab46fbd2 - - - - - -] [instance: f6ec6baf-a91e-4c7e-b1cf-b176d952068f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct 10 06:15:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:00 np0005479823 nova_compute[235775]: 2025-10-10 10:15:00.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:15:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:00.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:15:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:15:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:00.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:01 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:15:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:01 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:15:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:01 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:15:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:01 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:15:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:02.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:02 np0005479823 nova_compute[235775]: 2025-10-10 10:15:02.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:15:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:02.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:15:02 np0005479823 nova_compute[235775]: 2025-10-10 10:15:02.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:03 np0005479823 nova_compute[235775]: 2025-10-10 10:15:03.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:04.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:04.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:05 np0005479823 nova_compute[235775]: 2025-10-10 10:15:05.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:15:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:06 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:15:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:06 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:15:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:06 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:15:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:06 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:15:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:15:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:06.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:15:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:15:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:06.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:15:07 np0005479823 nova_compute[235775]: 2025-10-10 10:15:07.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000065s ======
Oct 10 06:15:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:08.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Oct 10 06:15:08 np0005479823 podman[242247]: 2025-10-10 10:15:08.776769198 +0000 UTC m=+0.056973782 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 10 06:15:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:08.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:10 np0005479823 nova_compute[235775]: 2025-10-10 10:15:10.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:10.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:15:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:10.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:15:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:15:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:15:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:11 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:15:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:12.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:12 np0005479823 nova_compute[235775]: 2025-10-10 10:15:12.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:12.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:14.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:14.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:15 np0005479823 nova_compute[235775]: 2025-10-10 10:15:15.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:15:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:15:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:15:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:15:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:16 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:15:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:16.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:15:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:16.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:15:17 np0005479823 nova_compute[235775]: 2025-10-10 10:15:17.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:18.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:15:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:18.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:15:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:19 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 06:15:19 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.0 total, 600.0 interval#012Cumulative writes: 5489 writes, 28K keys, 5489 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.04 MB/s#012Cumulative WAL: 5489 writes, 5489 syncs, 1.00 writes per sync, written: 0.07 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1548 writes, 7390 keys, 1548 commit groups, 1.0 writes per commit group, ingest: 16.91 MB, 0.03 MB/s#012Interval WAL: 1548 writes, 1548 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    134.5      0.32              0.12        15    0.021       0      0       0.0       0.0#012  L6      1/0   13.66 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   4.1    175.0    150.0      1.18              0.46        14    0.084     73K   7381       0.0       0.0#012 Sum      1/0   13.66 MB   0.0      0.2     0.0      0.2       0.2      0.1       0.0   5.1    137.5    146.7      1.50              0.58        29    0.052     73K   7381       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   6.9    158.0    161.0      0.47              0.22        10    0.047     30K   2558       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   0.0    175.0    150.0      1.18              0.46        14    0.084     73K   7381       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    135.3      0.32              0.12        14    0.023       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1800.0 total, 600.0 interval#012Flush(GB): cumulative 0.042, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.21 GB write, 0.12 MB/s write, 0.20 GB read, 0.11 MB/s read, 1.5 seconds#012Interval compaction: 0.07 GB write, 0.13 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x56161a963350#2 capacity: 304.00 MB usage: 17.45 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.000153 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(931,16.87 MB,5.5481%) FilterBlock(29,219.23 KB,0.0704263%) IndexBlock(29,378.61 KB,0.121624%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Oct 10 06:15:20 np0005479823 nova_compute[235775]: 2025-10-10 10:15:20.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:20.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:15:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:20.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:15:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:15:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:15:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:21 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:15:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:15:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:22.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:15:22 np0005479823 nova_compute[235775]: 2025-10-10 10:15:22.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:15:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:22.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:15:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:24.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:15:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:24.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:15:25 np0005479823 nova_compute[235775]: 2025-10-10 10:15:25.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:15:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:15:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:15:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:15:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:26 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:15:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:26.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:26.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:27 np0005479823 nova_compute[235775]: 2025-10-10 10:15:27.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:28.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:28 np0005479823 podman[242313]: 2025-10-10 10:15:28.8035987 +0000 UTC m=+0.068123213 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 10 06:15:28 np0005479823 podman[242312]: 2025-10-10 10:15:28.805964515 +0000 UTC m=+0.074795016 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 06:15:28 np0005479823 podman[242311]: 2025-10-10 10:15:28.826706252 +0000 UTC m=+0.090018495 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct 10 06:15:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:15:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:28.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:15:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:30 np0005479823 nova_compute[235775]: 2025-10-10 10:15:30.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:30.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:15:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:30.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:31 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:15:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:31 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:15:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:31 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:15:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:31 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:15:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:32.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:32 np0005479823 nova_compute[235775]: 2025-10-10 10:15:32.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:15:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:32.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:15:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:15:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:34.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:15:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:15:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:34.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:15:35 np0005479823 nova_compute[235775]: 2025-10-10 10:15:35.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:15:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Oct 10 06:15:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:15:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:15:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:15:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:36 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:15:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:36.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:36.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:37 np0005479823 nova_compute[235775]: 2025-10-10 10:15:37.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:38.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:38.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:38 np0005479823 podman[242413]: 2025-10-10 10:15:38.979043715 +0000 UTC m=+0.064802315 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, 
tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 10 06:15:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:39 np0005479823 ovn_controller[132503]: 2025-10-10T10:15:39Z|00037|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Oct 10 06:15:40 np0005479823 nova_compute[235775]: 2025-10-10 10:15:40.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:40.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:15:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:40.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:15:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:15:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:15:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:41 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:15:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:15:41.469 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:15:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:15:41.469 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:15:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:15:41.469 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:15:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:42.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:42 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:15:42 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:15:42 np0005479823 nova_compute[235775]: 2025-10-10 10:15:42.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:15:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:42.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:15:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:44.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:44 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:15:44 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:15:44 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:15:44 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:15:44 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:15:44 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:15:44 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:15:44 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:15:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:44.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:45 np0005479823 nova_compute[235775]: 2025-10-10 10:15:45.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:15:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:15:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:46 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:15:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:46 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:15:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:46 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:15:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:15:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:46.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:15:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:15:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:46.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:15:47 np0005479823 nova_compute[235775]: 2025-10-10 10:15:47.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:48.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:48 np0005479823 nova_compute[235775]: 2025-10-10 10:15:48.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:15:48 np0005479823 nova_compute[235775]: 2025-10-10 10:15:48.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct 10 06:15:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:48.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:49 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 06:15:49 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.0 total, 600.0 interval#012Cumulative writes: 8276 writes, 33K keys, 8276 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s#012Cumulative WAL: 8276 writes, 2019 syncs, 4.10 writes per sync, written: 0.03 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2282 writes, 8748 keys, 2282 commit groups, 1.0 writes per commit group, ingest: 10.36 MB, 0.02 MB/s#012Interval WAL: 2282 writes, 922 syncs, 2.48 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Oct 10 06:15:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:49 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:15:49 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:15:50 np0005479823 nova_compute[235775]: 2025-10-10 10:15:50.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:50.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:15:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:50 np0005479823 nova_compute[235775]: 2025-10-10 10:15:50.832 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:15:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:15:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:50.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:15:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:15:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:15:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:15:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:51 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:15:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:51 np0005479823 nova_compute[235775]: 2025-10-10 10:15:51.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:15:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:52.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:15:52.482 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct 10 06:15:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:15:52.482 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct 10 06:15:52 np0005479823 nova_compute[235775]: 2025-10-10 10:15:52.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:52 np0005479823 nova_compute[235775]: 2025-10-10 10:15:52.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:52 np0005479823 nova_compute[235775]: 2025-10-10 10:15:52.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:15:52 np0005479823 nova_compute[235775]: 2025-10-10 10:15:52.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 10 06:15:52 np0005479823 nova_compute[235775]: 2025-10-10 10:15:52.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 10 06:15:52 np0005479823 nova_compute[235775]: 2025-10-10 10:15:52.827 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct 10 06:15:52 np0005479823 nova_compute[235775]: 2025-10-10 10:15:52.828 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:15:52 np0005479823 nova_compute[235775]: 2025-10-10 10:15:52.828 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:15:52 np0005479823 nova_compute[235775]: 2025-10-10 10:15:52.828 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct 10 06:15:52 np0005479823 nova_compute[235775]: 2025-10-10 10:15:52.846 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct 10 06:15:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:52.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:53 np0005479823 nova_compute[235775]: 2025-10-10 10:15:53.833 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:15:53 np0005479823 nova_compute[235775]: 2025-10-10 10:15:53.859 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:15:53 np0005479823 nova_compute[235775]: 2025-10-10 10:15:53.859 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:15:53 np0005479823 nova_compute[235775]: 2025-10-10 10:15:53.859 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:15:53 np0005479823 nova_compute[235775]: 2025-10-10 10:15:53.859 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 10 06:15:53 np0005479823 nova_compute[235775]: 2025-10-10 10:15:53.860 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:15:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:15:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:54.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:15:54 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:15:54 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2472353382' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:15:54 np0005479823 nova_compute[235775]: 2025-10-10 10:15:54.340 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:15:54 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:15:54.484 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:15:54 np0005479823 nova_compute[235775]: 2025-10-10 10:15:54.518 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:15:54 np0005479823 nova_compute[235775]: 2025-10-10 10:15:54.519 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4913MB free_disk=59.9427490234375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 10 06:15:54 np0005479823 nova_compute[235775]: 2025-10-10 10:15:54.519 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:15:54 np0005479823 nova_compute[235775]: 2025-10-10 10:15:54.520 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:15:54 np0005479823 nova_compute[235775]: 2025-10-10 10:15:54.675 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 10 06:15:54 np0005479823 nova_compute[235775]: 2025-10-10 10:15:54.676 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 10 06:15:54 np0005479823 nova_compute[235775]: 2025-10-10 10:15:54.730 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:15:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:54.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:55 np0005479823 nova_compute[235775]: 2025-10-10 10:15:55.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:15:55 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/375310620' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:15:55 np0005479823 nova_compute[235775]: 2025-10-10 10:15:55.166 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:15:55 np0005479823 nova_compute[235775]: 2025-10-10 10:15:55.171 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 10 06:15:55 np0005479823 nova_compute[235775]: 2025-10-10 10:15:55.191 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 10 06:15:55 np0005479823 nova_compute[235775]: 2025-10-10 10:15:55.192 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct 10 06:15:55 np0005479823 nova_compute[235775]: 2025-10-10 10:15:55.192 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.673s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:15:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:15:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:15:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:15:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:15:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:15:56 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:15:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:56.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:56 np0005479823 nova_compute[235775]: 2025-10-10 10:15:56.174 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:15:56 np0005479823 nova_compute[235775]: 2025-10-10 10:15:56.191 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:15:56 np0005479823 nova_compute[235775]: 2025-10-10 10:15:56.192 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:15:56 np0005479823 nova_compute[235775]: 2025-10-10 10:15:56.192 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:15:56 np0005479823 nova_compute[235775]: 2025-10-10 10:15:56.192 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct 10 06:15:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:56 np0005479823 nova_compute[235775]: 2025-10-10 10:15:56.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:15:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:15:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:56.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:15:57 np0005479823 nova_compute[235775]: 2025-10-10 10:15:57.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:15:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:57 np0005479823 nova_compute[235775]: 2025-10-10 10:15:57.829 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:15:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:15:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:15:58.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:15:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:15:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:15:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:15:58.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:15:59 np0005479823 podman[242701]: 2025-10-10 10:15:59.063254132 +0000 UTC m=+0.048131839 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct 10 06:15:59 np0005479823 podman[242699]: 2025-10-10 10:15:59.063237252 +0000 UTC m=+0.053282515 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd)
Oct 10 06:15:59 np0005479823 podman[242700]: 2025-10-10 10:15:59.087621446 +0000 UTC m=+0.075379666 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_controller)
Oct 10 06:15:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:15:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:15:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:15:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:00 np0005479823 nova_compute[235775]: 2025-10-10 10:16:00.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:16:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:00.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:16:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:00.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:01 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:16:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:01 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:16:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:01 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:16:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:01 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:16:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:02.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:02 np0005479823 nova_compute[235775]: 2025-10-10 10:16:02.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:16:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:16:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:02.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:16:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:16:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:04.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:16:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:04.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:05 np0005479823 nova_compute[235775]: 2025-10-10 10:16:05.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:16:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:16:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:16:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:16:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:16:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:06 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:16:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:06.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:06.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:07 np0005479823 nova_compute[235775]: 2025-10-10 10:16:07.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:16:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:16:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:08.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:16:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:16:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:08.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:16:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:09 np0005479823 podman[242773]: 2025-10-10 10:16:09.815246882 +0000 UTC m=+0.082780313 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 10 06:16:10 np0005479823 nova_compute[235775]: 2025-10-10 10:16:10.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:16:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:16:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:10.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:16:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:16:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:16:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:10.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:16:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:16:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:16:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:16:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:11 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:16:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:16:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:12.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:16:12 np0005479823 nova_compute[235775]: 2025-10-10 10:16:12.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:16:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:16:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:12.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:16:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:14.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:16:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:14.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:16:15 np0005479823 nova_compute[235775]: 2025-10-10 10:16:15.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:16:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:16:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:16:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:16:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:16:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:16 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:16:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:16.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:16:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:16.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:16:17 np0005479823 nova_compute[235775]: 2025-10-10 10:16:17.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:16:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:18.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:18.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:20 np0005479823 nova_compute[235775]: 2025-10-10 10:16:20.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:16:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:16:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:20.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:16:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:16:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:16:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:20.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:16:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:16:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:16:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:16:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:21 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:16:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:16:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:22.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:16:22 np0005479823 nova_compute[235775]: 2025-10-10 10:16:22.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:16:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:22.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:24.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:24 np0005479823 nova_compute[235775]: 2025-10-10 10:16:24.903 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:16:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:24.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:25 np0005479823 nova_compute[235775]: 2025-10-10 10:16:25.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:16:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:16:25 np0005479823 systemd[1]: virtsecretd.service: Deactivated successfully.
Oct 10 06:16:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:16:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:16:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:16:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:26 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:16:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:26.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:26.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:27 np0005479823 nova_compute[235775]: 2025-10-10 10:16:27.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:16:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:16:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:28.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:16:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:28.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:29 np0005479823 podman[242843]: 2025-10-10 10:16:29.814878687 +0000 UTC m=+0.087781344 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 10 06:16:29 np0005479823 podman[242844]: 2025-10-10 10:16:29.820780557 +0000 UTC m=+0.091114952 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Oct 10 06:16:29 np0005479823 podman[242842]: 2025-10-10 10:16:29.822662397 +0000 UTC m=+0.089900892 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 10 06:16:30 np0005479823 nova_compute[235775]: 2025-10-10 10:16:30.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:16:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:30.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:16:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:30.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:16:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:16:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:16:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:31 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:16:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:16:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:32.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:16:32 np0005479823 nova_compute[235775]: 2025-10-10 10:16:32.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:16:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:32.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:34.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:34.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:35 np0005479823 nova_compute[235775]: 2025-10-10 10:16:35.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:16:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:16:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:16:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:16:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:16:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:36 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:16:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:36.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:36.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:37 np0005479823 nova_compute[235775]: 2025-10-10 10:16:37.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:16:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:38.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:38.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:40 np0005479823 nova_compute[235775]: 2025-10-10 10:16:40.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:16:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:40.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:16:40 np0005479823 podman[242943]: 2025-10-10 10:16:40.764097471 +0000 UTC m=+0.043709988 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct 10 06:16:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:40.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:16:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:16:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:16:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:41 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:16:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:16:41.470 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:16:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:16:41.470 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:16:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:16:41.470 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:16:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:42.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:42 np0005479823 nova_compute[235775]: 2025-10-10 10:16:42.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:16:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:16:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:42.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:16:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:44.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:44.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:45 np0005479823 nova_compute[235775]: 2025-10-10 10:16:45.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:16:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:16:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:16:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:16:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:16:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:46 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:16:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:46.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:46.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:47 np0005479823 nova_compute[235775]: 2025-10-10 10:16:47.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:16:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:16:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:48.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:16:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:48.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:50 np0005479823 nova_compute[235775]: 2025-10-10 10:16:50.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:16:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:16:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:50.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:16:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:16:50 np0005479823 podman[243144]: 2025-10-10 10:16:50.545372971 +0000 UTC m=+0.051069324 container create 318485f3a8d84ac65abf731e7fedbe880bb7635bc73e57d3cf19c0033f333c33 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_bell, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2)
Oct 10 06:16:50 np0005479823 systemd[1]: Started libpod-conmon-318485f3a8d84ac65abf731e7fedbe880bb7635bc73e57d3cf19c0033f333c33.scope.
Oct 10 06:16:50 np0005479823 podman[243144]: 2025-10-10 10:16:50.521348158 +0000 UTC m=+0.027044541 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 06:16:50 np0005479823 systemd[1]: Started libcrun container.
Oct 10 06:16:50 np0005479823 podman[243144]: 2025-10-10 10:16:50.644960004 +0000 UTC m=+0.150656367 container init 318485f3a8d84ac65abf731e7fedbe880bb7635bc73e57d3cf19c0033f333c33 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_bell, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Oct 10 06:16:50 np0005479823 podman[243144]: 2025-10-10 10:16:50.654352586 +0000 UTC m=+0.160048969 container start 318485f3a8d84ac65abf731e7fedbe880bb7635bc73e57d3cf19c0033f333c33 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_bell, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.vendor=CentOS)
Oct 10 06:16:50 np0005479823 podman[243144]: 2025-10-10 10:16:50.658702306 +0000 UTC m=+0.164398689 container attach 318485f3a8d84ac65abf731e7fedbe880bb7635bc73e57d3cf19c0033f333c33 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_bell, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True)
Oct 10 06:16:50 np0005479823 systemd[1]: libpod-318485f3a8d84ac65abf731e7fedbe880bb7635bc73e57d3cf19c0033f333c33.scope: Deactivated successfully.
Oct 10 06:16:50 np0005479823 strange_bell[243162]: 167 167
Oct 10 06:16:50 np0005479823 conmon[243162]: conmon 318485f3a8d84ac65abf <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-318485f3a8d84ac65abf731e7fedbe880bb7635bc73e57d3cf19c0033f333c33.scope/container/memory.events
Oct 10 06:16:50 np0005479823 podman[243144]: 2025-10-10 10:16:50.661901039 +0000 UTC m=+0.167597392 container died 318485f3a8d84ac65abf731e7fedbe880bb7635bc73e57d3cf19c0033f333c33 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_bell, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 06:16:50 np0005479823 systemd[1]: var-lib-containers-storage-overlay-713019a75dc8b0996e1ae5efb45190fbb56ebf3b4a978f04bdc48e490de56cc9-merged.mount: Deactivated successfully.
Oct 10 06:16:50 np0005479823 podman[243144]: 2025-10-10 10:16:50.698793295 +0000 UTC m=+0.204489678 container remove 318485f3a8d84ac65abf731e7fedbe880bb7635bc73e57d3cf19c0033f333c33 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=strange_bell, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1)
Oct 10 06:16:50 np0005479823 systemd[1]: libpod-conmon-318485f3a8d84ac65abf731e7fedbe880bb7635bc73e57d3cf19c0033f333c33.scope: Deactivated successfully.
Oct 10 06:16:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:50 np0005479823 nova_compute[235775]: 2025-10-10 10:16:50.831 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:16:50 np0005479823 podman[243185]: 2025-10-10 10:16:50.845008208 +0000 UTC m=+0.037745185 container create e710ce9fc70a0fc4e41972136ccac1a33830a938f488b24a558973a88c9e7009 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_brown, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, ceph=True)
Oct 10 06:16:50 np0005479823 systemd[1]: Started libpod-conmon-e710ce9fc70a0fc4e41972136ccac1a33830a938f488b24a558973a88c9e7009.scope.
Oct 10 06:16:50 np0005479823 systemd[1]: Started libcrun container.
Oct 10 06:16:50 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43c46d437726f5d70ba260116ae7b0fa0ec696361fa9ee7992283f204eb9a249/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Oct 10 06:16:50 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43c46d437726f5d70ba260116ae7b0fa0ec696361fa9ee7992283f204eb9a249/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Oct 10 06:16:50 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43c46d437726f5d70ba260116ae7b0fa0ec696361fa9ee7992283f204eb9a249/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Oct 10 06:16:50 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43c46d437726f5d70ba260116ae7b0fa0ec696361fa9ee7992283f204eb9a249/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Oct 10 06:16:50 np0005479823 podman[243185]: 2025-10-10 10:16:50.925060673 +0000 UTC m=+0.117797670 container init e710ce9fc70a0fc4e41972136ccac1a33830a938f488b24a558973a88c9e7009 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_brown, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Oct 10 06:16:50 np0005479823 podman[243185]: 2025-10-10 10:16:50.829271682 +0000 UTC m=+0.022008669 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Oct 10 06:16:50 np0005479823 podman[243185]: 2025-10-10 10:16:50.932815352 +0000 UTC m=+0.125552339 container start e710ce9fc70a0fc4e41972136ccac1a33830a938f488b24a558973a88c9e7009 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_brown, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 06:16:50 np0005479823 podman[243185]: 2025-10-10 10:16:50.936440049 +0000 UTC m=+0.129177046 container attach e710ce9fc70a0fc4e41972136ccac1a33830a938f488b24a558973a88c9e7009 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_brown, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Oct 10 06:16:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:50.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:16:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:16:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:16:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:51 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:16:51 np0005479823 pensive_brown[243202]: [
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:    {
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:        "available": false,
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:        "being_replaced": false,
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:        "ceph_device_lvm": false,
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:        "device_id": "QEMU_DVD-ROM_QM00001",
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:        "lsm_data": {},
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:        "lvs": [],
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:        "path": "/dev/sr0",
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:        "rejected_reasons": [
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            "Insufficient space (<5GB)",
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            "Has a FileSystem"
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:        ],
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:        "sys_api": {
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            "actuators": null,
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            "device_nodes": [
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:                "sr0"
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            ],
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            "devname": "sr0",
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            "human_readable_size": "482.00 KB",
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            "id_bus": "ata",
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            "model": "QEMU DVD-ROM",
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            "nr_requests": "2",
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            "parent": "/dev/sr0",
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            "partitions": {},
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            "path": "/dev/sr0",
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            "removable": "1",
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            "rev": "2.5+",
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            "ro": "0",
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            "rotational": "0",
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            "sas_address": "",
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            "sas_device_handle": "",
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            "scheduler_mode": "mq-deadline",
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            "sectors": 0,
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            "sectorsize": "2048",
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            "size": 493568.0,
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            "support_discard": "2048",
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            "type": "disk",
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:            "vendor": "QEMU"
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:        }
Oct 10 06:16:51 np0005479823 pensive_brown[243202]:    }
Oct 10 06:16:51 np0005479823 pensive_brown[243202]: ]
Oct 10 06:16:51 np0005479823 systemd[1]: libpod-e710ce9fc70a0fc4e41972136ccac1a33830a938f488b24a558973a88c9e7009.scope: Deactivated successfully.
Oct 10 06:16:51 np0005479823 conmon[243202]: conmon e710ce9fc70a0fc4e419 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-e710ce9fc70a0fc4e41972136ccac1a33830a938f488b24a558973a88c9e7009.scope/container/memory.events
Oct 10 06:16:51 np0005479823 podman[243185]: 2025-10-10 10:16:51.698137668 +0000 UTC m=+0.890874675 container died e710ce9fc70a0fc4e41972136ccac1a33830a938f488b24a558973a88c9e7009 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_brown, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_REF=squid)
Oct 10 06:16:51 np0005479823 systemd[1]: var-lib-containers-storage-overlay-43c46d437726f5d70ba260116ae7b0fa0ec696361fa9ee7992283f204eb9a249-merged.mount: Deactivated successfully.
Oct 10 06:16:51 np0005479823 podman[243185]: 2025-10-10 10:16:51.738725363 +0000 UTC m=+0.931462340 container remove e710ce9fc70a0fc4e41972136ccac1a33830a938f488b24a558973a88c9e7009 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_brown, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0)
Oct 10 06:16:51 np0005479823 systemd[1]: libpod-conmon-e710ce9fc70a0fc4e41972136ccac1a33830a938f488b24a558973a88c9e7009.scope: Deactivated successfully.
Oct 10 06:16:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:52.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:52 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:16:52 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:16:52 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:16:52 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:16:52 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:16:52 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:16:52 np0005479823 nova_compute[235775]: 2025-10-10 10:16:52.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:16:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:52 np0005479823 nova_compute[235775]: 2025-10-10 10:16:52.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:16:52 np0005479823 nova_compute[235775]: 2025-10-10 10:16:52.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:16:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:16:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:52.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:16:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:53 np0005479823 nova_compute[235775]: 2025-10-10 10:16:53.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:16:53 np0005479823 nova_compute[235775]: 2025-10-10 10:16:53.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 10 06:16:53 np0005479823 nova_compute[235775]: 2025-10-10 10:16:53.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 10 06:16:53 np0005479823 nova_compute[235775]: 2025-10-10 10:16:53.830 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct 10 06:16:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:16:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:54.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:16:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:54 np0005479823 nova_compute[235775]: 2025-10-10 10:16:54.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:16:54 np0005479823 nova_compute[235775]: 2025-10-10 10:16:54.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:16:54 np0005479823 nova_compute[235775]: 2025-10-10 10:16:54.836 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:16:54 np0005479823 nova_compute[235775]: 2025-10-10 10:16:54.837 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:16:54 np0005479823 nova_compute[235775]: 2025-10-10 10:16:54.837 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:16:54 np0005479823 nova_compute[235775]: 2025-10-10 10:16:54.838 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 10 06:16:54 np0005479823 nova_compute[235775]: 2025-10-10 10:16:54.838 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:16:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:54.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:55 np0005479823 nova_compute[235775]: 2025-10-10 10:16:55.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:16:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:16:55 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3908480547' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:16:55 np0005479823 nova_compute[235775]: 2025-10-10 10:16:55.285 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:16:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:16:55 np0005479823 nova_compute[235775]: 2025-10-10 10:16:55.467 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:16:55 np0005479823 nova_compute[235775]: 2025-10-10 10:16:55.468 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4861MB free_disk=59.89700698852539GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 10 06:16:55 np0005479823 nova_compute[235775]: 2025-10-10 10:16:55.468 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:16:55 np0005479823 nova_compute[235775]: 2025-10-10 10:16:55.469 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:16:55 np0005479823 nova_compute[235775]: 2025-10-10 10:16:55.543 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 10 06:16:55 np0005479823 nova_compute[235775]: 2025-10-10 10:16:55.544 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 10 06:16:55 np0005479823 nova_compute[235775]: 2025-10-10 10:16:55.595 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:16:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:16:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:16:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:16:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:16:56 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:16:56 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:16:56 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2955662946' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:16:56 np0005479823 nova_compute[235775]: 2025-10-10 10:16:56.024 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:16:56 np0005479823 nova_compute[235775]: 2025-10-10 10:16:56.029 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 10 06:16:56 np0005479823 nova_compute[235775]: 2025-10-10 10:16:56.048 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 10 06:16:56 np0005479823 nova_compute[235775]: 2025-10-10 10:16:56.050 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct 10 06:16:56 np0005479823 nova_compute[235775]: 2025-10-10 10:16:56.050 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.582s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:16:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:56.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:56.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:57 np0005479823 nova_compute[235775]: 2025-10-10 10:16:57.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:16:57 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:16:57 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:16:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:58 np0005479823 nova_compute[235775]: 2025-10-10 10:16:58.051 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:16:58 np0005479823 nova_compute[235775]: 2025-10-10 10:16:58.052 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:16:58 np0005479823 nova_compute[235775]: 2025-10-10 10:16:58.052 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:16:58 np0005479823 nova_compute[235775]: 2025-10-10 10:16:58.052 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct 10 06:16:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:16:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:16:58.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:16:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:16:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:16:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:16:58.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:16:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:16:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:16:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:16:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:00 np0005479823 nova_compute[235775]: 2025-10-10 10:17:00.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:17:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:17:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:00.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:17:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:17:00 np0005479823 podman[244493]: 2025-10-10 10:17:00.801010638 +0000 UTC m=+0.065507417 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct 10 06:17:00 np0005479823 podman[244495]: 2025-10-10 10:17:00.802100944 +0000 UTC m=+0.062138460 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, tcib_managed=true)
Oct 10 06:17:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:00 np0005479823 podman[244494]: 2025-10-10 10:17:00.843849626 +0000 UTC m=+0.108359155 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 10 06:17:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:00.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:17:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:17:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:17:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:01 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:17:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:02.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:02 np0005479823 nova_compute[235775]: 2025-10-10 10:17:02.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:17:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:02.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:04 np0005479823 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Oct 10 06:17:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:17:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:04.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.392683) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091424392735, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 1683, "num_deletes": 257, "total_data_size": 4233461, "memory_usage": 4297424, "flush_reason": "Manual Compaction"}
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091424410888, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 2743362, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28472, "largest_seqno": 30149, "table_properties": {"data_size": 2736402, "index_size": 3967, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1925, "raw_key_size": 14641, "raw_average_key_size": 19, "raw_value_size": 2722384, "raw_average_value_size": 3634, "num_data_blocks": 174, "num_entries": 749, "num_filter_entries": 749, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760091287, "oldest_key_time": 1760091287, "file_creation_time": 1760091424, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 18270 microseconds, and 10013 cpu microseconds.
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.410949) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 2743362 bytes OK
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.410978) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.412216) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.412238) EVENT_LOG_v1 {"time_micros": 1760091424412231, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.412260) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 4225731, prev total WAL file size 4225731, number of live WAL files 2.
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.414168) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353032' seq:72057594037927935, type:22 .. '6C6F676D00373535' seq:0, type:0; will stop at (end)
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(2679KB)], [54(13MB)]
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091424414231, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 17069281, "oldest_snapshot_seqno": -1}
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 6030 keys, 16925177 bytes, temperature: kUnknown
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091424505979, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 16925177, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16881925, "index_size": 27078, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15109, "raw_key_size": 153413, "raw_average_key_size": 25, "raw_value_size": 16770315, "raw_average_value_size": 2781, "num_data_blocks": 1111, "num_entries": 6030, "num_filter_entries": 6030, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760091424, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.506376) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 16925177 bytes
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.508474) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 185.7 rd, 184.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 13.7 +0.0 blob) out(16.1 +0.0 blob), read-write-amplify(12.4) write-amplify(6.2) OK, records in: 6562, records dropped: 532 output_compression: NoCompression
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.508489) EVENT_LOG_v1 {"time_micros": 1760091424508482, "job": 32, "event": "compaction_finished", "compaction_time_micros": 91901, "compaction_time_cpu_micros": 52781, "output_level": 6, "num_output_files": 1, "total_output_size": 16925177, "num_input_records": 6562, "num_output_records": 6030, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091424509159, "job": 32, "event": "table_file_deletion", "file_number": 56}
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091424511566, "job": 32, "event": "table_file_deletion", "file_number": 54}
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.413897) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.511653) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.511658) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.511659) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.511661) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:17:04 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:17:04.511663) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:17:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:04.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:05 np0005479823 nova_compute[235775]: 2025-10-10 10:17:05.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:17:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:17:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:17:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:17:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:17:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:06 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:17:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:06.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:06.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:07 np0005479823 nova_compute[235775]: 2025-10-10 10:17:07.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:17:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:08.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:08.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:10 np0005479823 nova_compute[235775]: 2025-10-10 10:17:10.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:17:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:10.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:17:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:10.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:17:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:17:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:17:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:11 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:17:11 np0005479823 podman[244571]: 2025-10-10 10:17:11.798788044 +0000 UTC m=+0.066211480 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct 10 06:17:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:12.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:12 np0005479823 nova_compute[235775]: 2025-10-10 10:17:12.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:17:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:13.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:13 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:17:13.207 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct 10 06:17:13 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:17:13.208 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct 10 06:17:13 np0005479823 nova_compute[235775]: 2025-10-10 10:17:13.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:17:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:17:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:14.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:17:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:15.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:15 np0005479823 nova_compute[235775]: 2025-10-10 10:17:15.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:17:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:17:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:17:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:17:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:17:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:16 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:17:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:16.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:17:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:17.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:17:17 np0005479823 nova_compute[235775]: 2025-10-10 10:17:17.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:17:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:17:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:18.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:17:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:17:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:19.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:17:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:20 np0005479823 nova_compute[235775]: 2025-10-10 10:17:20.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:17:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:17:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:20.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:17:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:17:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:17:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:17:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:17:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:21 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:17:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:17:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:21.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:17:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:22 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:17:22.210 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:17:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:17:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:22.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:17:22 np0005479823 nova_compute[235775]: 2025-10-10 10:17:22.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:17:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:23.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:17:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:24.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:17:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:25.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:25 np0005479823 nova_compute[235775]: 2025-10-10 10:17:25.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:17:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:17:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:17:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:17:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:17:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:26 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:17:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:17:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:26.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:17:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 10 06:17:26 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3051023765' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 06:17:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 10 06:17:26 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3051023765' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 06:17:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:27.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:27 np0005479823 nova_compute[235775]: 2025-10-10 10:17:27.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:17:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:28.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:29.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:30 np0005479823 nova_compute[235775]: 2025-10-10 10:17:30.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:17:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:30.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:17:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:17:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:17:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:17:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:31 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:17:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:31.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:31 np0005479823 podman[244637]: 2025-10-10 10:17:31.814513898 +0000 UTC m=+0.068107612 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 10 06:17:31 np0005479823 podman[244635]: 2025-10-10 10:17:31.814493857 +0000 UTC m=+0.078170256 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct 10 06:17:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:31 np0005479823 podman[244636]: 2025-10-10 10:17:31.85374623 +0000 UTC m=+0.111379774 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct 10 06:17:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:17:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:32.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:17:32 np0005479823 nova_compute[235775]: 2025-10-10 10:17:32.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:17:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:33.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:17:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:34.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:17:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:35.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:35 np0005479823 nova_compute[235775]: 2025-10-10 10:17:35.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:17:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:17:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:17:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:17:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:17:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:36 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:17:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:36.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:37.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:37 np0005479823 nova_compute[235775]: 2025-10-10 10:17:37.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:17:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:38.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:39.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:40 np0005479823 nova_compute[235775]: 2025-10-10 10:17:40.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:17:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:40.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:17:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:17:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:17:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:17:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:41 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:17:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:41.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:17:41.471 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:17:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:17:41.472 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:17:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:17:41.472 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:17:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:42.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:42 np0005479823 podman[244733]: 2025-10-10 10:17:42.768615829 +0000 UTC m=+0.050320351 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 06:17:42 np0005479823 nova_compute[235775]: 2025-10-10 10:17:42.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:17:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:17:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:43.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:17:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:44.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:45.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:45 np0005479823 nova_compute[235775]: 2025-10-10 10:17:45.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:17:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:17:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:17:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:17:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:17:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:46 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:17:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:46.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:47.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:47 np0005479823 nova_compute[235775]: 2025-10-10 10:17:47.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:17:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:17:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:48.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:17:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:49.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:50 np0005479823 nova_compute[235775]: 2025-10-10 10:17:50.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:17:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:50.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:17:50 np0005479823 nova_compute[235775]: 2025-10-10 10:17:50.810 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:17:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:17:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:17:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:17:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:51 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:17:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:17:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:51.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:17:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:52.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:52 np0005479823 nova_compute[235775]: 2025-10-10 10:17:52.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:17:52 np0005479823 nova_compute[235775]: 2025-10-10 10:17:52.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:17:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:53.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:53 np0005479823 nova_compute[235775]: 2025-10-10 10:17:53.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:17:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:54.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:54 np0005479823 nova_compute[235775]: 2025-10-10 10:17:54.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:17:54 np0005479823 nova_compute[235775]: 2025-10-10 10:17:54.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 10 06:17:54 np0005479823 nova_compute[235775]: 2025-10-10 10:17:54.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 10 06:17:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:54 np0005479823 nova_compute[235775]: 2025-10-10 10:17:54.853 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct 10 06:17:54 np0005479823 nova_compute[235775]: 2025-10-10 10:17:54.855 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:17:54 np0005479823 nova_compute[235775]: 2025-10-10 10:17:54.896 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:17:54 np0005479823 nova_compute[235775]: 2025-10-10 10:17:54.897 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:17:54 np0005479823 nova_compute[235775]: 2025-10-10 10:17:54.897 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:17:54 np0005479823 nova_compute[235775]: 2025-10-10 10:17:54.897 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 10 06:17:54 np0005479823 nova_compute[235775]: 2025-10-10 10:17:54.897 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:17:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:55.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:55 np0005479823 nova_compute[235775]: 2025-10-10 10:17:55.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:17:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:17:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:17:55 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1465875043' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:17:55 np0005479823 nova_compute[235775]: 2025-10-10 10:17:55.425 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:17:55 np0005479823 nova_compute[235775]: 2025-10-10 10:17:55.560 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:17:55 np0005479823 nova_compute[235775]: 2025-10-10 10:17:55.561 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4948MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 10 06:17:55 np0005479823 nova_compute[235775]: 2025-10-10 10:17:55.561 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:17:55 np0005479823 nova_compute[235775]: 2025-10-10 10:17:55.561 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:17:55 np0005479823 nova_compute[235775]: 2025-10-10 10:17:55.624 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 10 06:17:55 np0005479823 nova_compute[235775]: 2025-10-10 10:17:55.625 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 10 06:17:55 np0005479823 nova_compute[235775]: 2025-10-10 10:17:55.644 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:17:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:17:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:17:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:17:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:17:56 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:17:56 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:17:56 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3077656643' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:17:56 np0005479823 nova_compute[235775]: 2025-10-10 10:17:56.088 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:17:56 np0005479823 nova_compute[235775]: 2025-10-10 10:17:56.096 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 10 06:17:56 np0005479823 nova_compute[235775]: 2025-10-10 10:17:56.116 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 10 06:17:56 np0005479823 nova_compute[235775]: 2025-10-10 10:17:56.118 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct 10 06:17:56 np0005479823 nova_compute[235775]: 2025-10-10 10:17:56.119 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.557s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:17:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:56.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:17:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:57.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:17:57 np0005479823 nova_compute[235775]: 2025-10-10 10:17:57.078 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:17:57 np0005479823 nova_compute[235775]: 2025-10-10 10:17:57.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:17:57 np0005479823 nova_compute[235775]: 2025-10-10 10:17:57.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:17:57 np0005479823 nova_compute[235775]: 2025-10-10 10:17:57.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:17:57 np0005479823 nova_compute[235775]: 2025-10-10 10:17:57.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct 10 06:17:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:58 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:17:58 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:17:58 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:17:58 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:17:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:17:58.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:17:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:17:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:17:59.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:17:59 np0005479823 nova_compute[235775]: 2025-10-10 10:17:59.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:17:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:17:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:17:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:17:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:00 np0005479823 nova_compute[235775]: 2025-10-10 10:18:00.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.003000098s ======
Oct 10 06:18:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:00.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000098s
Oct 10 06:18:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:18:00 np0005479823 nova_compute[235775]: 2025-10-10 10:18:00.811 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:18:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:18:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:18:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:18:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:01 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:18:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:18:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:01.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:18:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:18:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:02.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:18:02 np0005479823 podman[244922]: 2025-10-10 10:18:02.79860728 +0000 UTC m=+0.064489365 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3)
Oct 10 06:18:02 np0005479823 nova_compute[235775]: 2025-10-10 10:18:02.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:02 np0005479823 podman[244924]: 2025-10-10 10:18:02.816115704 +0000 UTC m=+0.066094748 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid)
Oct 10 06:18:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:02 np0005479823 podman[244923]: 2025-10-10 10:18:02.838154293 +0000 UTC m=+0.096849416 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true)
Oct 10 06:18:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:03.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:03 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:18:03 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:18:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:18:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:04.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:18:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:18:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:05.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:18:05 np0005479823 nova_compute[235775]: 2025-10-10 10:18:05.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:18:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:18:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:18:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:18:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:06 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:18:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:06.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:07.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:07 np0005479823 nova_compute[235775]: 2025-10-10 10:18:07.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:08 np0005479823 nova_compute[235775]: 2025-10-10 10:18:08.115 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:18:08 np0005479823 nova_compute[235775]: 2025-10-10 10:18:08.116 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 06:18:08 np0005479823 nova_compute[235775]: 2025-10-10 10:18:08.140 2 DEBUG nova.compute.manager [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct 10 06:18:08 np0005479823 nova_compute[235775]: 2025-10-10 10:18:08.304 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 06:18:08 np0005479823 nova_compute[235775]: 2025-10-10 10:18:08.305 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 06:18:08 np0005479823 nova_compute[235775]: 2025-10-10 10:18:08.312 2 DEBUG nova.virt.hardware [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct 10 06:18:08 np0005479823 nova_compute[235775]: 2025-10-10 10:18:08.312 2 INFO nova.compute.claims [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Claim successful on node compute-2.ctlplane.example.com
Oct 10 06:18:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:08.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:08 np0005479823 nova_compute[235775]: 2025-10-10 10:18:08.436 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 06:18:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:08 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:18:08 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2863660861' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:18:08 np0005479823 nova_compute[235775]: 2025-10-10 10:18:08.875 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 06:18:08 np0005479823 nova_compute[235775]: 2025-10-10 10:18:08.880 2 DEBUG nova.compute.provider_tree [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 06:18:08 np0005479823 nova_compute[235775]: 2025-10-10 10:18:08.899 2 DEBUG nova.scheduler.client.report [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 06:18:08 np0005479823 nova_compute[235775]: 2025-10-10 10:18:08.925 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.620s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 06:18:08 np0005479823 nova_compute[235775]: 2025-10-10 10:18:08.925 2 DEBUG nova.compute.manager [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct 10 06:18:08 np0005479823 nova_compute[235775]: 2025-10-10 10:18:08.984 2 DEBUG nova.compute.manager [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct 10 06:18:08 np0005479823 nova_compute[235775]: 2025-10-10 10:18:08.984 2 DEBUG nova.network.neutron [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct 10 06:18:09 np0005479823 nova_compute[235775]: 2025-10-10 10:18:09.022 2 INFO nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 10 06:18:09 np0005479823 nova_compute[235775]: 2025-10-10 10:18:09.043 2 DEBUG nova.compute.manager [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct 10 06:18:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:18:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:09.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:18:09 np0005479823 nova_compute[235775]: 2025-10-10 10:18:09.187 2 DEBUG nova.compute.manager [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct 10 06:18:09 np0005479823 nova_compute[235775]: 2025-10-10 10:18:09.189 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct 10 06:18:09 np0005479823 nova_compute[235775]: 2025-10-10 10:18:09.189 2 INFO nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Creating image(s)
Oct 10 06:18:09 np0005479823 nova_compute[235775]: 2025-10-10 10:18:09.220 2 DEBUG nova.storage.rbd_utils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 06:18:09 np0005479823 nova_compute[235775]: 2025-10-10 10:18:09.250 2 DEBUG nova.storage.rbd_utils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 06:18:09 np0005479823 nova_compute[235775]: 2025-10-10 10:18:09.277 2 DEBUG nova.storage.rbd_utils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 06:18:09 np0005479823 nova_compute[235775]: 2025-10-10 10:18:09.280 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 06:18:09 np0005479823 nova_compute[235775]: 2025-10-10 10:18:09.351 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 06:18:09 np0005479823 nova_compute[235775]: 2025-10-10 10:18:09.352 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "eec5fe2328f977d3b1a385313e521aef425c0ac1" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 06:18:09 np0005479823 nova_compute[235775]: 2025-10-10 10:18:09.353 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "eec5fe2328f977d3b1a385313e521aef425c0ac1" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 06:18:09 np0005479823 nova_compute[235775]: 2025-10-10 10:18:09.353 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "eec5fe2328f977d3b1a385313e521aef425c0ac1" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 06:18:09 np0005479823 nova_compute[235775]: 2025-10-10 10:18:09.375 2 DEBUG nova.storage.rbd_utils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Oct 10 06:18:09 np0005479823 nova_compute[235775]: 2025-10-10 10:18:09.378 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1 49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 06:18:09 np0005479823 nova_compute[235775]: 2025-10-10 10:18:09.596 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1 49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.218s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 06:18:09 np0005479823 nova_compute[235775]: 2025-10-10 10:18:09.670 2 DEBUG nova.storage.rbd_utils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] resizing rbd image 49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Oct 10 06:18:09 np0005479823 nova_compute[235775]: 2025-10-10 10:18:09.824 2 DEBUG nova.objects.instance [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lazy-loading 'migration_context' on Instance uuid 49a68fc9-f469-4827-9bb8-f2c2981d2b68 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct 10 06:18:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:09 np0005479823 nova_compute[235775]: 2025-10-10 10:18:09.845 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct 10 06:18:09 np0005479823 nova_compute[235775]: 2025-10-10 10:18:09.845 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Ensure instance console log exists: /var/lib/nova/instances/49a68fc9-f469-4827-9bb8-f2c2981d2b68/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct 10 06:18:09 np0005479823 nova_compute[235775]: 2025-10-10 10:18:09.846 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 06:18:09 np0005479823 nova_compute[235775]: 2025-10-10 10:18:09.846 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 06:18:09 np0005479823 nova_compute[235775]: 2025-10-10 10:18:09.847 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 06:18:10 np0005479823 nova_compute[235775]: 2025-10-10 10:18:10.054 2 DEBUG nova.policy [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7956778c03764aaf8906c9b435337976', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd5e531d4b440422d946eaf6fd4e166f7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct 10 06:18:10 np0005479823 nova_compute[235775]: 2025-10-10 10:18:10.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:18:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:18:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:18:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:10.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:18:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:18:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:18:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:18:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:11 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:18:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:11.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.166559) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091491167119, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 1184, "num_deletes": 501, "total_data_size": 1952418, "memory_usage": 1978488, "flush_reason": "Manual Compaction"}
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091491174715, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 897365, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30154, "largest_seqno": 31333, "table_properties": {"data_size": 893126, "index_size": 1379, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 13960, "raw_average_key_size": 19, "raw_value_size": 882191, "raw_average_value_size": 1232, "num_data_blocks": 61, "num_entries": 716, "num_filter_entries": 716, "num_deletions": 501, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760091425, "oldest_key_time": 1760091425, "file_creation_time": 1760091491, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 8203 microseconds, and 3455 cpu microseconds.
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.174768) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 897365 bytes OK
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.174791) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.177100) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.177124) EVENT_LOG_v1 {"time_micros": 1760091491177117, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.177145) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 1945717, prev total WAL file size 1945717, number of live WAL files 2.
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.178146) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(876KB)], [57(16MB)]
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091491178179, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 17822542, "oldest_snapshot_seqno": -1}
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5753 keys, 12048656 bytes, temperature: kUnknown
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091491245991, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 12048656, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12012624, "index_size": 20562, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14405, "raw_key_size": 148782, "raw_average_key_size": 25, "raw_value_size": 11911109, "raw_average_value_size": 2070, "num_data_blocks": 824, "num_entries": 5753, "num_filter_entries": 5753, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760091491, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.246220) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 12048656 bytes
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.247957) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 262.6 rd, 177.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 16.1 +0.0 blob) out(11.5 +0.0 blob), read-write-amplify(33.3) write-amplify(13.4) OK, records in: 6746, records dropped: 993 output_compression: NoCompression
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.247974) EVENT_LOG_v1 {"time_micros": 1760091491247966, "job": 34, "event": "compaction_finished", "compaction_time_micros": 67874, "compaction_time_cpu_micros": 35310, "output_level": 6, "num_output_files": 1, "total_output_size": 12048656, "num_input_records": 6746, "num_output_records": 5753, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091491248239, "job": 34, "event": "table_file_deletion", "file_number": 59}
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091491251515, "job": 34, "event": "table_file_deletion", "file_number": 57}
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.178067) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.251564) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.251569) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.251571) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.251573) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:18:11 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:18:11.251575) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:18:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:11 np0005479823 nova_compute[235775]: 2025-10-10 10:18:11.860 2 DEBUG nova.network.neutron [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Successfully updated port: 864e1646-5abd-4268-a80a-c224425c842d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct 10 06:18:11 np0005479823 nova_compute[235775]: 2025-10-10 10:18:11.874 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "refresh_cache-49a68fc9-f469-4827-9bb8-f2c2981d2b68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 10 06:18:11 np0005479823 nova_compute[235775]: 2025-10-10 10:18:11.874 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquired lock "refresh_cache-49a68fc9-f469-4827-9bb8-f2c2981d2b68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct 10 06:18:11 np0005479823 nova_compute[235775]: 2025-10-10 10:18:11.874 2 DEBUG nova.network.neutron [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct 10 06:18:11 np0005479823 nova_compute[235775]: 2025-10-10 10:18:11.949 2 DEBUG nova.compute.manager [req-e18b1dab-6534-4331-96b1-a6ab7fb69ead req-c947b94f-7989-4b20-9960-742d02460bfc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Received event network-changed-864e1646-5abd-4268-a80a-c224425c842d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 06:18:11 np0005479823 nova_compute[235775]: 2025-10-10 10:18:11.950 2 DEBUG nova.compute.manager [req-e18b1dab-6534-4331-96b1-a6ab7fb69ead req-c947b94f-7989-4b20-9960-742d02460bfc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Refreshing instance network info cache due to event network-changed-864e1646-5abd-4268-a80a-c224425c842d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct 10 06:18:11 np0005479823 nova_compute[235775]: 2025-10-10 10:18:11.950 2 DEBUG oslo_concurrency.lockutils [req-e18b1dab-6534-4331-96b1-a6ab7fb69ead req-c947b94f-7989-4b20-9960-742d02460bfc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "refresh_cache-49a68fc9-f469-4827-9bb8-f2c2981d2b68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct 10 06:18:12 np0005479823 nova_compute[235775]: 2025-10-10 10:18:12.057 2 DEBUG nova.network.neutron [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct 10 06:18:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:18:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:12.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:18:12 np0005479823 nova_compute[235775]: 2025-10-10 10:18:12.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:13.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:13 np0005479823 nova_compute[235775]: 2025-10-10 10:18:13.533 2 DEBUG nova.network.neutron [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Updating instance_info_cache with network_info: [{"id": "864e1646-5abd-4268-a80a-c224425c842d", "address": "fa:16:3e:19:de:db", "network": {"id": "f2187c16-3ad9-4fc6-892a-d36a6262d4d0", "bridge": "br-int", "label": "tempest-network-smoke--807297116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap864e1646-5a", "ovs_interfaceid": "864e1646-5abd-4268-a80a-c224425c842d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct 10 06:18:13 np0005479823 nova_compute[235775]: 2025-10-10 10:18:13.576 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Releasing lock "refresh_cache-49a68fc9-f469-4827-9bb8-f2c2981d2b68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct 10 06:18:13 np0005479823 nova_compute[235775]: 2025-10-10 10:18:13.576 2 DEBUG nova.compute.manager [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Instance network_info: |[{"id": "864e1646-5abd-4268-a80a-c224425c842d", "address": "fa:16:3e:19:de:db", "network": {"id": "f2187c16-3ad9-4fc6-892a-d36a6262d4d0", "bridge": "br-int", "label": "tempest-network-smoke--807297116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap864e1646-5a", "ovs_interfaceid": "864e1646-5abd-4268-a80a-c224425c842d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct 10 06:18:13 np0005479823 nova_compute[235775]: 2025-10-10 10:18:13.577 2 DEBUG oslo_concurrency.lockutils [req-e18b1dab-6534-4331-96b1-a6ab7fb69ead req-c947b94f-7989-4b20-9960-742d02460bfc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquired lock "refresh_cache-49a68fc9-f469-4827-9bb8-f2c2981d2b68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct 10 06:18:13 np0005479823 nova_compute[235775]: 2025-10-10 10:18:13.577 2 DEBUG nova.network.neutron [req-e18b1dab-6534-4331-96b1-a6ab7fb69ead req-c947b94f-7989-4b20-9960-742d02460bfc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Refreshing network info cache for port 864e1646-5abd-4268-a80a-c224425c842d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct 10 06:18:13 np0005479823 nova_compute[235775]: 2025-10-10 10:18:13.579 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Start _get_guest_xml network_info=[{"id": "864e1646-5abd-4268-a80a-c224425c842d", "address": "fa:16:3e:19:de:db", "network": {"id": "f2187c16-3ad9-4fc6-892a-d36a6262d4d0", "bridge": "br-int", "label": "tempest-network-smoke--807297116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap864e1646-5a", "ovs_interfaceid": "864e1646-5abd-4268-a80a-c224425c842d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-10T10:09:50Z,direct_url=<?>,disk_format='qcow2',id=5ae78700-970d-45b4-a57d-978a054c7519,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ec962e275689437d80680ff3ea69c852',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-10T10:09:53Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'encryption_options': None, 'image_id': '5ae78700-970d-45b4-a57d-978a054c7519'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct 10 06:18:13 np0005479823 nova_compute[235775]: 2025-10-10 10:18:13.583 2 WARNING nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:18:13 np0005479823 nova_compute[235775]: 2025-10-10 10:18:13.587 2 DEBUG nova.virt.libvirt.host [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct 10 06:18:13 np0005479823 nova_compute[235775]: 2025-10-10 10:18:13.588 2 DEBUG nova.virt.libvirt.host [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct 10 06:18:13 np0005479823 nova_compute[235775]: 2025-10-10 10:18:13.591 2 DEBUG nova.virt.libvirt.host [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct 10 06:18:13 np0005479823 nova_compute[235775]: 2025-10-10 10:18:13.592 2 DEBUG nova.virt.libvirt.host [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct 10 06:18:13 np0005479823 nova_compute[235775]: 2025-10-10 10:18:13.592 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct 10 06:18:13 np0005479823 nova_compute[235775]: 2025-10-10 10:18:13.592 2 DEBUG nova.virt.hardware [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-10T10:09:48Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='00373e71-6208-4238-ad85-db0452c53bc6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-10T10:09:50Z,direct_url=<?>,disk_format='qcow2',id=5ae78700-970d-45b4-a57d-978a054c7519,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ec962e275689437d80680ff3ea69c852',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-10T10:09:53Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct 10 06:18:13 np0005479823 nova_compute[235775]: 2025-10-10 10:18:13.593 2 DEBUG nova.virt.hardware [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct 10 06:18:13 np0005479823 nova_compute[235775]: 2025-10-10 10:18:13.593 2 DEBUG nova.virt.hardware [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct 10 06:18:13 np0005479823 nova_compute[235775]: 2025-10-10 10:18:13.593 2 DEBUG nova.virt.hardware [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct 10 06:18:13 np0005479823 nova_compute[235775]: 2025-10-10 10:18:13.593 2 DEBUG nova.virt.hardware [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct 10 06:18:13 np0005479823 nova_compute[235775]: 2025-10-10 10:18:13.593 2 DEBUG nova.virt.hardware [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct 10 06:18:13 np0005479823 nova_compute[235775]: 2025-10-10 10:18:13.594 2 DEBUG nova.virt.hardware [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct 10 06:18:13 np0005479823 nova_compute[235775]: 2025-10-10 10:18:13.594 2 DEBUG nova.virt.hardware [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct 10 06:18:13 np0005479823 nova_compute[235775]: 2025-10-10 10:18:13.594 2 DEBUG nova.virt.hardware [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct 10 06:18:13 np0005479823 nova_compute[235775]: 2025-10-10 10:18:13.594 2 DEBUG nova.virt.hardware [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct 10 06:18:13 np0005479823 nova_compute[235775]: 2025-10-10 10:18:13.594 2 DEBUG nova.virt.hardware [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct 10 06:18:13 np0005479823 nova_compute[235775]: 2025-10-10 10:18:13.597 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:18:13 np0005479823 podman[245214]: 2025-10-10 10:18:13.79389691 +0000 UTC m=+0.065499978 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Oct 10 06:18:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:14 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Oct 10 06:18:14 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2249811561' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.071 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.094 2 DEBUG nova.storage.rbd_utils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.098 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:18:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:14.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:14 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Oct 10 06:18:14 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1035205217' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.612 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.514s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.613 2 DEBUG nova.virt.libvirt.vif [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-10T10:18:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-681617856',display_name='tempest-TestNetworkBasicOps-server-681617856',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-681617856',id=9,image_ref='5ae78700-970d-45b4-a57d-978a054c7519',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBIcYTEXDYtAk18KooLsNGiBbJHsQVG+1VrBdrz3ofp65nb477sGHgmoQEtvfZnvU1CDeiIFLoTRDtJRom4RiTMzgyKw8lTmf0SFcI9wASAJTcgKdt8HRVl+kZ8Ero4zmQ==',key_name='tempest-TestNetworkBasicOps-1805593060',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d5e531d4b440422d946eaf6fd4e166f7',ramdisk_id='',reservation_id='r-0k9ji85m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5ae78700-970d-45b4-a57d-978a054c7519',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-188749107',owner_user_name='tempest-TestNetworkBasicOps-188749107-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-10T10:18:09Z,user_data=None,user_id='7956778c03764aaf8906c9b435337976',uuid=49a68fc9-f469-4827-9bb8-f2c2981d2b68,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "864e1646-5abd-4268-a80a-c224425c842d", "address": "fa:16:3e:19:de:db", "network": {"id": "f2187c16-3ad9-4fc6-892a-d36a6262d4d0", "bridge": "br-int", "label": "tempest-network-smoke--807297116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap864e1646-5a", "ovs_interfaceid": "864e1646-5abd-4268-a80a-c224425c842d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.614 2 DEBUG nova.network.os_vif_util [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converting VIF {"id": "864e1646-5abd-4268-a80a-c224425c842d", "address": "fa:16:3e:19:de:db", "network": {"id": "f2187c16-3ad9-4fc6-892a-d36a6262d4d0", "bridge": "br-int", "label": "tempest-network-smoke--807297116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap864e1646-5a", "ovs_interfaceid": "864e1646-5abd-4268-a80a-c224425c842d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.615 2 DEBUG nova.network.os_vif_util [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:de:db,bridge_name='br-int',has_traffic_filtering=True,id=864e1646-5abd-4268-a80a-c224425c842d,network=Network(f2187c16-3ad9-4fc6-892a-d36a6262d4d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap864e1646-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.615 2 DEBUG nova.objects.instance [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 49a68fc9-f469-4827-9bb8-f2c2981d2b68 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.637 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] End _get_guest_xml xml=<domain type="kvm">
Oct 10 06:18:14 np0005479823 nova_compute[235775]:  <uuid>49a68fc9-f469-4827-9bb8-f2c2981d2b68</uuid>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:  <name>instance-00000009</name>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:  <memory>131072</memory>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:  <vcpu>1</vcpu>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:  <metadata>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <nova:name>tempest-TestNetworkBasicOps-server-681617856</nova:name>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <nova:creationTime>2025-10-10 10:18:13</nova:creationTime>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <nova:flavor name="m1.nano">
Oct 10 06:18:14 np0005479823 nova_compute[235775]:        <nova:memory>128</nova:memory>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:        <nova:disk>1</nova:disk>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:        <nova:swap>0</nova:swap>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:        <nova:ephemeral>0</nova:ephemeral>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:        <nova:vcpus>1</nova:vcpus>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      </nova:flavor>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <nova:owner>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:        <nova:user uuid="7956778c03764aaf8906c9b435337976">tempest-TestNetworkBasicOps-188749107-project-member</nova:user>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:        <nova:project uuid="d5e531d4b440422d946eaf6fd4e166f7">tempest-TestNetworkBasicOps-188749107</nova:project>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      </nova:owner>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <nova:root type="image" uuid="5ae78700-970d-45b4-a57d-978a054c7519"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <nova:ports>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:        <nova:port uuid="864e1646-5abd-4268-a80a-c224425c842d">
Oct 10 06:18:14 np0005479823 nova_compute[235775]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:        </nova:port>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      </nova:ports>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    </nova:instance>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:  </metadata>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:  <sysinfo type="smbios">
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <system>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <entry name="manufacturer">RDO</entry>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <entry name="product">OpenStack Compute</entry>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <entry name="serial">49a68fc9-f469-4827-9bb8-f2c2981d2b68</entry>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <entry name="uuid">49a68fc9-f469-4827-9bb8-f2c2981d2b68</entry>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <entry name="family">Virtual Machine</entry>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    </system>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:  </sysinfo>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:  <os>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <type arch="x86_64" machine="q35">hvm</type>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <boot dev="hd"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <smbios mode="sysinfo"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:  </os>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:  <features>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <acpi/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <apic/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <vmcoreinfo/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:  </features>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:  <clock offset="utc">
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <timer name="pit" tickpolicy="delay"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <timer name="rtc" tickpolicy="catchup"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <timer name="hpet" present="no"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:  </clock>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:  <cpu mode="host-model" match="exact">
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <topology sockets="1" cores="1" threads="1"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:  </cpu>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:  <devices>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <disk type="network" device="disk">
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <driver type="raw" cache="none"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <source protocol="rbd" name="vms/49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk">
Oct 10 06:18:14 np0005479823 nova_compute[235775]:        <host name="192.168.122.100" port="6789"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:        <host name="192.168.122.102" port="6789"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:        <host name="192.168.122.101" port="6789"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      </source>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <auth username="openstack">
Oct 10 06:18:14 np0005479823 nova_compute[235775]:        <secret type="ceph" uuid="21f084a3-af34-5230-afe4-ea5cd24a55f4"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      </auth>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <target dev="vda" bus="virtio"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    </disk>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <disk type="network" device="cdrom">
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <driver type="raw" cache="none"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <source protocol="rbd" name="vms/49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk.config">
Oct 10 06:18:14 np0005479823 nova_compute[235775]:        <host name="192.168.122.100" port="6789"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:        <host name="192.168.122.102" port="6789"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:        <host name="192.168.122.101" port="6789"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      </source>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <auth username="openstack">
Oct 10 06:18:14 np0005479823 nova_compute[235775]:        <secret type="ceph" uuid="21f084a3-af34-5230-afe4-ea5cd24a55f4"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      </auth>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <target dev="sda" bus="sata"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    </disk>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <interface type="ethernet">
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <mac address="fa:16:3e:19:de:db"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <model type="virtio"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <driver name="vhost" rx_queue_size="512"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <mtu size="1442"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <target dev="tap864e1646-5a"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    </interface>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <serial type="pty">
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <log file="/var/lib/nova/instances/49a68fc9-f469-4827-9bb8-f2c2981d2b68/console.log" append="off"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    </serial>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <video>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <model type="virtio"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    </video>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <input type="tablet" bus="usb"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <rng model="virtio">
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <backend model="random">/dev/urandom</backend>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    </rng>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <controller type="usb" index="0"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    <memballoon model="virtio">
Oct 10 06:18:14 np0005479823 nova_compute[235775]:      <stats period="10"/>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:    </memballoon>
Oct 10 06:18:14 np0005479823 nova_compute[235775]:  </devices>
Oct 10 06:18:14 np0005479823 nova_compute[235775]: </domain>
Oct 10 06:18:14 np0005479823 nova_compute[235775]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.638 2 DEBUG nova.compute.manager [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Preparing to wait for external event network-vif-plugged-864e1646-5abd-4268-a80a-c224425c842d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.638 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.638 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.639 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.639 2 DEBUG nova.virt.libvirt.vif [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-10T10:18:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-681617856',display_name='tempest-TestNetworkBasicOps-server-681617856',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-681617856',id=9,image_ref='5ae78700-970d-45b4-a57d-978a054c7519',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBIcYTEXDYtAk18KooLsNGiBbJHsQVG+1VrBdrz3ofp65nb477sGHgmoQEtvfZnvU1CDeiIFLoTRDtJRom4RiTMzgyKw8lTmf0SFcI9wASAJTcgKdt8HRVl+kZ8Ero4zmQ==',key_name='tempest-TestNetworkBasicOps-1805593060',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d5e531d4b440422d946eaf6fd4e166f7',ramdisk_id='',reservation_id='r-0k9ji85m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5ae78700-970d-45b4-a57d-978a054c7519',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-188749107',owner_user_name='tempest-TestNetworkBasicOps-188749107-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-10T10:18:09Z,user_data=None,user_id='7956778c03764aaf8906c9b435337976',uuid=49a68fc9-f469-4827-9bb8-f2c2981d2b68,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "864e1646-5abd-4268-a80a-c224425c842d", "address": "fa:16:3e:19:de:db", "network": {"id": "f2187c16-3ad9-4fc6-892a-d36a6262d4d0", "bridge": "br-int", "label": "tempest-network-smoke--807297116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap864e1646-5a", "ovs_interfaceid": "864e1646-5abd-4268-a80a-c224425c842d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.639 2 DEBUG nova.network.os_vif_util [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converting VIF {"id": "864e1646-5abd-4268-a80a-c224425c842d", "address": "fa:16:3e:19:de:db", "network": {"id": "f2187c16-3ad9-4fc6-892a-d36a6262d4d0", "bridge": "br-int", "label": "tempest-network-smoke--807297116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap864e1646-5a", "ovs_interfaceid": "864e1646-5abd-4268-a80a-c224425c842d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.640 2 DEBUG nova.network.os_vif_util [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:de:db,bridge_name='br-int',has_traffic_filtering=True,id=864e1646-5abd-4268-a80a-c224425c842d,network=Network(f2187c16-3ad9-4fc6-892a-d36a6262d4d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap864e1646-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.640 2 DEBUG os_vif [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:de:db,bridge_name='br-int',has_traffic_filtering=True,id=864e1646-5abd-4268-a80a-c224425c842d,network=Network(f2187c16-3ad9-4fc6-892a-d36a6262d4d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap864e1646-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.641 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.641 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.644 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap864e1646-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.644 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap864e1646-5a, col_values=(('external_ids', {'iface-id': '864e1646-5abd-4268-a80a-c224425c842d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:19:de:db', 'vm-uuid': '49a68fc9-f469-4827-9bb8-f2c2981d2b68'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:14 np0005479823 NetworkManager[44866]: <info>  [1760091494.6481] manager: (tap864e1646-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.662 2 INFO os_vif [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:de:db,bridge_name='br-int',has_traffic_filtering=True,id=864e1646-5abd-4268-a80a-c224425c842d,network=Network(f2187c16-3ad9-4fc6-892a-d36a6262d4d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap864e1646-5a')#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.724 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.725 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.725 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] No VIF found with MAC fa:16:3e:19:de:db, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.725 2 INFO nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Using config drive#033[00m
Oct 10 06:18:14 np0005479823 nova_compute[235775]: 2025-10-10 10:18:14.751 2 DEBUG nova.storage.rbd_utils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct 10 06:18:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:18:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:15.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:18:15 np0005479823 nova_compute[235775]: 2025-10-10 10:18:15.241 2 DEBUG nova.network.neutron [req-e18b1dab-6534-4331-96b1-a6ab7fb69ead req-c947b94f-7989-4b20-9960-742d02460bfc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Updated VIF entry in instance network info cache for port 864e1646-5abd-4268-a80a-c224425c842d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct 10 06:18:15 np0005479823 nova_compute[235775]: 2025-10-10 10:18:15.242 2 DEBUG nova.network.neutron [req-e18b1dab-6534-4331-96b1-a6ab7fb69ead req-c947b94f-7989-4b20-9960-742d02460bfc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Updating instance_info_cache with network_info: [{"id": "864e1646-5abd-4268-a80a-c224425c842d", "address": "fa:16:3e:19:de:db", "network": {"id": "f2187c16-3ad9-4fc6-892a-d36a6262d4d0", "bridge": "br-int", "label": "tempest-network-smoke--807297116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap864e1646-5a", "ovs_interfaceid": "864e1646-5abd-4268-a80a-c224425c842d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct 10 06:18:15 np0005479823 nova_compute[235775]: 2025-10-10 10:18:15.261 2 DEBUG oslo_concurrency.lockutils [req-e18b1dab-6534-4331-96b1-a6ab7fb69ead req-c947b94f-7989-4b20-9960-742d02460bfc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Releasing lock "refresh_cache-49a68fc9-f469-4827-9bb8-f2c2981d2b68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct 10 06:18:15 np0005479823 nova_compute[235775]: 2025-10-10 10:18:15.304 2 INFO nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Creating config drive at /var/lib/nova/instances/49a68fc9-f469-4827-9bb8-f2c2981d2b68/disk.config#033[00m
Oct 10 06:18:15 np0005479823 nova_compute[235775]: 2025-10-10 10:18:15.315 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/49a68fc9-f469-4827-9bb8-f2c2981d2b68/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdo0s4rze execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:18:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:18:15 np0005479823 nova_compute[235775]: 2025-10-10 10:18:15.446 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/49a68fc9-f469-4827-9bb8-f2c2981d2b68/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdo0s4rze" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:18:15 np0005479823 nova_compute[235775]: 2025-10-10 10:18:15.484 2 DEBUG nova.storage.rbd_utils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct 10 06:18:15 np0005479823 nova_compute[235775]: 2025-10-10 10:18:15.490 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/49a68fc9-f469-4827-9bb8-f2c2981d2b68/disk.config 49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:18:15 np0005479823 nova_compute[235775]: 2025-10-10 10:18:15.657 2 DEBUG oslo_concurrency.processutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/49a68fc9-f469-4827-9bb8-f2c2981d2b68/disk.config 49a68fc9-f469-4827-9bb8-f2c2981d2b68_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.168s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:18:15 np0005479823 nova_compute[235775]: 2025-10-10 10:18:15.659 2 INFO nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Deleting local config drive /var/lib/nova/instances/49a68fc9-f469-4827-9bb8-f2c2981d2b68/disk.config because it was imported into RBD.#033[00m
Oct 10 06:18:15 np0005479823 systemd[1]: Starting libvirt secret daemon...
Oct 10 06:18:15 np0005479823 systemd[1]: Started libvirt secret daemon.
Oct 10 06:18:15 np0005479823 kernel: tap864e1646-5a: entered promiscuous mode
Oct 10 06:18:15 np0005479823 NetworkManager[44866]: <info>  [1760091495.7659] manager: (tap864e1646-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Oct 10 06:18:15 np0005479823 ovn_controller[132503]: 2025-10-10T10:18:15Z|00038|binding|INFO|Claiming lport 864e1646-5abd-4268-a80a-c224425c842d for this chassis.
Oct 10 06:18:15 np0005479823 ovn_controller[132503]: 2025-10-10T10:18:15Z|00039|binding|INFO|864e1646-5abd-4268-a80a-c224425c842d: Claiming fa:16:3e:19:de:db 10.100.0.4
Oct 10 06:18:15 np0005479823 nova_compute[235775]: 2025-10-10 10:18:15.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:15 np0005479823 NetworkManager[44866]: <info>  [1760091495.7871] manager: (patch-br-int-to-provnet-1d90fa58-74cb-4ad4-84e0-739689a69111): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Oct 10 06:18:15 np0005479823 NetworkManager[44866]: <info>  [1760091495.7877] manager: (patch-provnet-1d90fa58-74cb-4ad4-84e0-739689a69111-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Oct 10 06:18:15 np0005479823 nova_compute[235775]: 2025-10-10 10:18:15.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:15 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.793 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:de:db 10.100.0.4'], port_security=['fa:16:3e:19:de:db 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1060241160', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '49a68fc9-f469-4827-9bb8-f2c2981d2b68', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2187c16-3ad9-4fc6-892a-d36a6262d4d0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1060241160', 'neutron:project_id': 'd5e531d4b440422d946eaf6fd4e166f7', 'neutron:revision_number': '7', 'neutron:security_group_ids': '79abf760-0fb0-448c-b5c8-75027ac31ae3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.195'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=58a83406-32bd-40d9-b3dd-ed56e38abb09, chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>], logical_port=864e1646-5abd-4268-a80a-c224425c842d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct 10 06:18:15 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.794 141795 INFO neutron.agent.ovn.metadata.agent [-] Port 864e1646-5abd-4268-a80a-c224425c842d in datapath f2187c16-3ad9-4fc6-892a-d36a6262d4d0 bound to our chassis#033[00m
Oct 10 06:18:15 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.795 141795 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f2187c16-3ad9-4fc6-892a-d36a6262d4d0#033[00m
Oct 10 06:18:15 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.809 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[640d9617-f1f7-4663-b12f-e28577560800]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:18:15 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.810 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf2187c16-31 in ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct 10 06:18:15 np0005479823 systemd-udevd[245387]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 06:18:15 np0005479823 systemd-machined[192768]: New machine qemu-2-instance-00000009.
Oct 10 06:18:15 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.811 241439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf2187c16-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct 10 06:18:15 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.812 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[12875ac3-de6a-40ed-9ec2-fb7dd8e499d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:18:15 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.812 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[2995be21-5576-4861-b1ba-f3c5c11f802a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:18:15 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.824 141908 DEBUG oslo.privsep.daemon [-] privsep: reply[a86481f6-cf93-446a-9c89-e73b17fba6f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:18:15 np0005479823 NetworkManager[44866]: <info>  [1760091495.8256] device (tap864e1646-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 10 06:18:15 np0005479823 NetworkManager[44866]: <info>  [1760091495.8264] device (tap864e1646-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 10 06:18:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:15 np0005479823 systemd[1]: Started Virtual Machine qemu-2-instance-00000009.
Oct 10 06:18:15 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.851 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[4e50da17-715c-4d84-9faa-5cc7489ca375]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:18:15 np0005479823 nova_compute[235775]: 2025-10-10 10:18:15.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:15 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.878 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[7e732e89-34d4-4ca7-b0bb-37d8c50dd535]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:18:15 np0005479823 nova_compute[235775]: 2025-10-10 10:18:15.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:15 np0005479823 NetworkManager[44866]: <info>  [1760091495.8843] manager: (tapf2187c16-30): new Veth device (/org/freedesktop/NetworkManager/Devices/37)
Oct 10 06:18:15 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.883 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[cca50fbb-8be7-42ed-ae60-ef59b4965de7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:18:15 np0005479823 nova_compute[235775]: 2025-10-10 10:18:15.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:15 np0005479823 ovn_controller[132503]: 2025-10-10T10:18:15Z|00040|binding|INFO|Setting lport 864e1646-5abd-4268-a80a-c224425c842d ovn-installed in OVS
Oct 10 06:18:15 np0005479823 ovn_controller[132503]: 2025-10-10T10:18:15Z|00041|binding|INFO|Setting lport 864e1646-5abd-4268-a80a-c224425c842d up in Southbound
Oct 10 06:18:15 np0005479823 nova_compute[235775]: 2025-10-10 10:18:15.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:15 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.916 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[c97adbd6-2c95-4346-9b48-267a7c5c743e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:18:15 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.918 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[8397f14d-8146-46f8-b077-24f4e861e559]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:18:15 np0005479823 NetworkManager[44866]: <info>  [1760091495.9375] device (tapf2187c16-30): carrier: link connected
Oct 10 06:18:15 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.941 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[d6b1fc67-1f65-430b-a869-b003421dc8a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:18:15 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.956 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[75a0f06b-88ea-4534-bfbc-d8ede8097ee9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf2187c16-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:33:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439949, 'reachable_time': 25639, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245419, 'error': None, 'target': 'ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:18:15 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.967 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[7fdaa929-b5d8-4063-9701-ddcbd88d3d00]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8f:3311'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439949, 'tstamp': 439949}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245420, 'error': None, 'target': 'ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:18:15 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:15.984 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[38ade2a9-f218-4c73-802d-beb381f85e0a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf2187c16-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8f:33:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439949, 'reachable_time': 25639, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245421, 'error': None, 'target': 'ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:18:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:18:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:18:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:18:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:16 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:16.026 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[aa61a3b6-eca6-48c4-a8f2-0ff80d6ab3e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:16.101 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[a09de635-44d2-4dbf-9cfa-9f8aad578773]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:16.102 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf2187c16-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:16.103 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:16.103 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf2187c16-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:18:16 np0005479823 kernel: tapf2187c16-30: entered promiscuous mode
Oct 10 06:18:16 np0005479823 nova_compute[235775]: 2025-10-10 10:18:16.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:16 np0005479823 NetworkManager[44866]: <info>  [1760091496.1085] manager: (tapf2187c16-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:16.109 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf2187c16-30, col_values=(('external_ids', {'iface-id': 'e9f075b6-37df-4f28-90c0-0fcdd3460568'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:18:16 np0005479823 nova_compute[235775]: 2025-10-10 10:18:16.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:16 np0005479823 ovn_controller[132503]: 2025-10-10T10:18:16Z|00042|binding|INFO|Releasing lport e9f075b6-37df-4f28-90c0-0fcdd3460568 from this chassis (sb_readonly=0)
Oct 10 06:18:16 np0005479823 nova_compute[235775]: 2025-10-10 10:18:16.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:16.134 141795 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f2187c16-3ad9-4fc6-892a-d36a6262d4d0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f2187c16-3ad9-4fc6-892a-d36a6262d4d0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:16.135 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[fe067dea-f1c8-4370-823a-e9e6c9c6ee68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:16.135 141795 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]: global
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]:    log         /dev/log local0 debug
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]:    log-tag     haproxy-metadata-proxy-f2187c16-3ad9-4fc6-892a-d36a6262d4d0
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]:    user        root
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]:    group       root
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]:    maxconn     1024
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]:    pidfile     /var/lib/neutron/external/pids/f2187c16-3ad9-4fc6-892a-d36a6262d4d0.pid.haproxy
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]:    daemon
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]: 
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]: defaults
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]:    log global
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]:    mode http
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]:    option httplog
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]:    option dontlognull
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]:    option http-server-close
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]:    option forwardfor
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]:    retries                 3
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]:    timeout http-request    30s
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]:    timeout connect         30s
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]:    timeout client          32s
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]:    timeout server          32s
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]:    timeout http-keep-alive 30s
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]: 
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]: 
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]: listen listener
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]:    bind 169.254.169.254:80
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]:    server metadata /var/lib/neutron/metadata_proxy
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]:    http-request add-header X-OVN-Network-ID f2187c16-3ad9-4fc6-892a-d36a6262d4d0
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct 10 06:18:16 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:16.136 141795 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0', 'env', 'PROCESS_TAG=haproxy-f2187c16-3ad9-4fc6-892a-d36a6262d4d0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f2187c16-3ad9-4fc6-892a-d36a6262d4d0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct 10 06:18:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:18:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:16.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:18:16 np0005479823 podman[245495]: 2025-10-10 10:18:16.489941313 +0000 UTC m=+0.051034275 container create 0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 10 06:18:16 np0005479823 systemd[1]: Started libpod-conmon-0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca.scope.
Oct 10 06:18:16 np0005479823 podman[245495]: 2025-10-10 10:18:16.46362223 +0000 UTC m=+0.024715212 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 10 06:18:16 np0005479823 systemd[1]: Started libcrun container.
Oct 10 06:18:16 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02920de339ab3d96609b58c8fc65fe45954dd283163f9dcf7301f5c71f47af34/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 10 06:18:16 np0005479823 podman[245495]: 2025-10-10 10:18:16.58228024 +0000 UTC m=+0.143373212 container init 0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 10 06:18:16 np0005479823 podman[245495]: 2025-10-10 10:18:16.589863762 +0000 UTC m=+0.150956714 container start 0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 10 06:18:16 np0005479823 neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0[245510]: [NOTICE]   (245514) : New worker (245517) forked
Oct 10 06:18:16 np0005479823 neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0[245510]: [NOTICE]   (245514) : Loading success.
Oct 10 06:18:16 np0005479823 nova_compute[235775]: 2025-10-10 10:18:16.627 2 DEBUG nova.virt.driver [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Emitting event <LifecycleEvent: 1760091496.6262186, 49a68fc9-f469-4827-9bb8-f2c2981d2b68 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 10 06:18:16 np0005479823 nova_compute[235775]: 2025-10-10 10:18:16.627 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] VM Started (Lifecycle Event)
Oct 10 06:18:16 np0005479823 nova_compute[235775]: 2025-10-10 10:18:16.656 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 10 06:18:16 np0005479823 nova_compute[235775]: 2025-10-10 10:18:16.662 2 DEBUG nova.virt.driver [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Emitting event <LifecycleEvent: 1760091496.627454, 49a68fc9-f469-4827-9bb8-f2c2981d2b68 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 10 06:18:16 np0005479823 nova_compute[235775]: 2025-10-10 10:18:16.663 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] VM Paused (Lifecycle Event)
Oct 10 06:18:16 np0005479823 nova_compute[235775]: 2025-10-10 10:18:16.684 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 10 06:18:16 np0005479823 nova_compute[235775]: 2025-10-10 10:18:16.689 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 10 06:18:16 np0005479823 nova_compute[235775]: 2025-10-10 10:18:16.716 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 10 06:18:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:17.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:17 np0005479823 nova_compute[235775]: 2025-10-10 10:18:17.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:18:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:17 np0005479823 nova_compute[235775]: 2025-10-10 10:18:17.911 2 DEBUG nova.compute.manager [req-798379c4-24d0-4d4f-8bc9-120affb2373e req-8a24784a-2e80-40d9-a76b-af77d223ff2b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Received event network-vif-plugged-864e1646-5abd-4268-a80a-c224425c842d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 06:18:17 np0005479823 nova_compute[235775]: 2025-10-10 10:18:17.911 2 DEBUG oslo_concurrency.lockutils [req-798379c4-24d0-4d4f-8bc9-120affb2373e req-8a24784a-2e80-40d9-a76b-af77d223ff2b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 06:18:17 np0005479823 nova_compute[235775]: 2025-10-10 10:18:17.912 2 DEBUG oslo_concurrency.lockutils [req-798379c4-24d0-4d4f-8bc9-120affb2373e req-8a24784a-2e80-40d9-a76b-af77d223ff2b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 06:18:17 np0005479823 nova_compute[235775]: 2025-10-10 10:18:17.912 2 DEBUG oslo_concurrency.lockutils [req-798379c4-24d0-4d4f-8bc9-120affb2373e req-8a24784a-2e80-40d9-a76b-af77d223ff2b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 06:18:17 np0005479823 nova_compute[235775]: 2025-10-10 10:18:17.912 2 DEBUG nova.compute.manager [req-798379c4-24d0-4d4f-8bc9-120affb2373e req-8a24784a-2e80-40d9-a76b-af77d223ff2b 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Processing event network-vif-plugged-864e1646-5abd-4268-a80a-c224425c842d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct 10 06:18:17 np0005479823 nova_compute[235775]: 2025-10-10 10:18:17.913 2 DEBUG nova.compute.manager [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct 10 06:18:17 np0005479823 nova_compute[235775]: 2025-10-10 10:18:17.916 2 DEBUG nova.virt.driver [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Emitting event <LifecycleEvent: 1760091497.9161136, 49a68fc9-f469-4827-9bb8-f2c2981d2b68 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct 10 06:18:17 np0005479823 nova_compute[235775]: 2025-10-10 10:18:17.916 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] VM Resumed (Lifecycle Event)
Oct 10 06:18:17 np0005479823 nova_compute[235775]: 2025-10-10 10:18:17.918 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct 10 06:18:17 np0005479823 nova_compute[235775]: 2025-10-10 10:18:17.922 2 INFO nova.virt.libvirt.driver [-] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Instance spawned successfully.
Oct 10 06:18:17 np0005479823 nova_compute[235775]: 2025-10-10 10:18:17.922 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct 10 06:18:17 np0005479823 nova_compute[235775]: 2025-10-10 10:18:17.942 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 10 06:18:17 np0005479823 nova_compute[235775]: 2025-10-10 10:18:17.951 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct 10 06:18:17 np0005479823 nova_compute[235775]: 2025-10-10 10:18:17.958 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 10 06:18:17 np0005479823 nova_compute[235775]: 2025-10-10 10:18:17.959 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 10 06:18:17 np0005479823 nova_compute[235775]: 2025-10-10 10:18:17.960 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 10 06:18:17 np0005479823 nova_compute[235775]: 2025-10-10 10:18:17.960 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 10 06:18:17 np0005479823 nova_compute[235775]: 2025-10-10 10:18:17.961 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 10 06:18:17 np0005479823 nova_compute[235775]: 2025-10-10 10:18:17.962 2 DEBUG nova.virt.libvirt.driver [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct 10 06:18:17 np0005479823 nova_compute[235775]: 2025-10-10 10:18:17.974 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] During sync_power_state the instance has a pending task (spawning). Skip.
Oct 10 06:18:18 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:18.023 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 10 06:18:18 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:18.024 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct 10 06:18:18 np0005479823 nova_compute[235775]: 2025-10-10 10:18:18.025 2 INFO nova.compute.manager [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Took 8.84 seconds to spawn the instance on the hypervisor.
Oct 10 06:18:18 np0005479823 nova_compute[235775]: 2025-10-10 10:18:18.026 2 DEBUG nova.compute.manager [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct 10 06:18:18 np0005479823 nova_compute[235775]: 2025-10-10 10:18:18.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:18:18 np0005479823 nova_compute[235775]: 2025-10-10 10:18:18.096 2 INFO nova.compute.manager [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Took 9.82 seconds to build instance.
Oct 10 06:18:18 np0005479823 nova_compute[235775]: 2025-10-10 10:18:18.121 2 DEBUG oslo_concurrency.lockutils [None req-b12368b2-b191-4843-a0c7-de737b39363b 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 06:18:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:18.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:19.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:19 np0005479823 nova_compute[235775]: 2025-10-10 10:18:19.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:18:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:20 np0005479823 nova_compute[235775]: 2025-10-10 10:18:20.000 2 DEBUG nova.compute.manager [req-3041c46d-dd2a-49ed-8b70-dba51c18e83d req-d343a51f-0122-42d6-82e7-de71882c631f 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Received event network-vif-plugged-864e1646-5abd-4268-a80a-c224425c842d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 06:18:20 np0005479823 nova_compute[235775]: 2025-10-10 10:18:20.000 2 DEBUG oslo_concurrency.lockutils [req-3041c46d-dd2a-49ed-8b70-dba51c18e83d req-d343a51f-0122-42d6-82e7-de71882c631f 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 06:18:20 np0005479823 nova_compute[235775]: 2025-10-10 10:18:20.001 2 DEBUG oslo_concurrency.lockutils [req-3041c46d-dd2a-49ed-8b70-dba51c18e83d req-d343a51f-0122-42d6-82e7-de71882c631f 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 06:18:20 np0005479823 nova_compute[235775]: 2025-10-10 10:18:20.001 2 DEBUG oslo_concurrency.lockutils [req-3041c46d-dd2a-49ed-8b70-dba51c18e83d req-d343a51f-0122-42d6-82e7-de71882c631f 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 06:18:20 np0005479823 nova_compute[235775]: 2025-10-10 10:18:20.002 2 DEBUG nova.compute.manager [req-3041c46d-dd2a-49ed-8b70-dba51c18e83d req-d343a51f-0122-42d6-82e7-de71882c631f 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] No waiting events found dispatching network-vif-plugged-864e1646-5abd-4268-a80a-c224425c842d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 10 06:18:20 np0005479823 nova_compute[235775]: 2025-10-10 10:18:20.002 2 WARNING nova.compute.manager [req-3041c46d-dd2a-49ed-8b70-dba51c18e83d req-d343a51f-0122-42d6-82e7-de71882c631f 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Received unexpected event network-vif-plugged-864e1646-5abd-4268-a80a-c224425c842d for instance with vm_state active and task_state None.
Oct 10 06:18:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:18:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:18:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:20.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:18:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:18:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:18:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:18:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:21 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:18:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:21.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:22.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:22 np0005479823 nova_compute[235775]: 2025-10-10 10:18:22.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.073 2 DEBUG oslo_concurrency.lockutils [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.074 2 DEBUG oslo_concurrency.lockutils [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.074 2 DEBUG oslo_concurrency.lockutils [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.075 2 DEBUG oslo_concurrency.lockutils [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.075 2 DEBUG oslo_concurrency.lockutils [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.077 2 INFO nova.compute.manager [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Terminating instance
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.079 2 DEBUG nova.compute.manager [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct 10 06:18:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:18:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:23.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:18:23 np0005479823 kernel: tap864e1646-5a (unregistering): left promiscuous mode
Oct 10 06:18:23 np0005479823 NetworkManager[44866]: <info>  [1760091503.1188] device (tap864e1646-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 10 06:18:23 np0005479823 ovn_controller[132503]: 2025-10-10T10:18:23Z|00043|binding|INFO|Releasing lport 864e1646-5abd-4268-a80a-c224425c842d from this chassis (sb_readonly=0)
Oct 10 06:18:23 np0005479823 ovn_controller[132503]: 2025-10-10T10:18:23Z|00044|binding|INFO|Setting lport 864e1646-5abd-4268-a80a-c224425c842d down in Southbound
Oct 10 06:18:23 np0005479823 ovn_controller[132503]: 2025-10-10T10:18:23Z|00045|binding|INFO|Removing iface tap864e1646-5a ovn-installed in OVS
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:18:23 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.133 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:de:db 10.100.0.4'], port_security=['fa:16:3e:19:de:db 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1060241160', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '49a68fc9-f469-4827-9bb8-f2c2981d2b68', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2187c16-3ad9-4fc6-892a-d36a6262d4d0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1060241160', 'neutron:project_id': 'd5e531d4b440422d946eaf6fd4e166f7', 'neutron:revision_number': '9', 'neutron:security_group_ids': '79abf760-0fb0-448c-b5c8-75027ac31ae3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.195', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=58a83406-32bd-40d9-b3dd-ed56e38abb09, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>], logical_port=864e1646-5abd-4268-a80a-c224425c842d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct 10 06:18:23 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.134 141795 INFO neutron.agent.ovn.metadata.agent [-] Port 864e1646-5abd-4268-a80a-c224425c842d in datapath f2187c16-3ad9-4fc6-892a-d36a6262d4d0 unbound from our chassis
Oct 10 06:18:23 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.135 141795 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f2187c16-3ad9-4fc6-892a-d36a6262d4d0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct 10 06:18:23 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.136 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[4eca5dc8-6509-42ee-a65b-85705c974ee9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct 10 06:18:23 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.136 141795 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0 namespace which is not needed anymore
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:18:23 np0005479823 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000009.scope: Deactivated successfully.
Oct 10 06:18:23 np0005479823 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000009.scope: Consumed 6.032s CPU time.
Oct 10 06:18:23 np0005479823 systemd-machined[192768]: Machine qemu-2-instance-00000009 terminated.
Oct 10 06:18:23 np0005479823 neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0[245510]: [NOTICE]   (245514) : haproxy version is 2.8.14-c23fe91
Oct 10 06:18:23 np0005479823 neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0[245510]: [NOTICE]   (245514) : path to executable is /usr/sbin/haproxy
Oct 10 06:18:23 np0005479823 neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0[245510]: [WARNING]  (245514) : Exiting Master process...
Oct 10 06:18:23 np0005479823 neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0[245510]: [WARNING]  (245514) : Exiting Master process...
Oct 10 06:18:23 np0005479823 neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0[245510]: [ALERT]    (245514) : Current worker (245517) exited with code 143 (Terminated)
Oct 10 06:18:23 np0005479823 neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0[245510]: [WARNING]  (245514) : All workers exited. Exiting... (0)
Oct 10 06:18:23 np0005479823 systemd[1]: libpod-0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca.scope: Deactivated successfully.
Oct 10 06:18:23 np0005479823 podman[245582]: 2025-10-10 10:18:23.256878559 +0000 UTC m=+0.038074430 container died 0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 10 06:18:23 np0005479823 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca-userdata-shm.mount: Deactivated successfully.
Oct 10 06:18:23 np0005479823 systemd[1]: var-lib-containers-storage-overlay-02920de339ab3d96609b58c8fc65fe45954dd283163f9dcf7301f5c71f47af34-merged.mount: Deactivated successfully.
Oct 10 06:18:23 np0005479823 podman[245582]: 2025-10-10 10:18:23.289099761 +0000 UTC m=+0.070295612 container cleanup 0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 06:18:23 np0005479823 systemd[1]: libpod-conmon-0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca.scope: Deactivated successfully.
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.315 2 INFO nova.virt.libvirt.driver [-] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Instance destroyed successfully.#033[00m
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.316 2 DEBUG nova.objects.instance [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lazy-loading 'resources' on Instance uuid 49a68fc9-f469-4827-9bb8-f2c2981d2b68 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.339 2 DEBUG nova.virt.libvirt.vif [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-10T10:18:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-681617856',display_name='tempest-TestNetworkBasicOps-server-681617856',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-681617856',id=9,image_ref='5ae78700-970d-45b4-a57d-978a054c7519',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBIcYTEXDYtAk18KooLsNGiBbJHsQVG+1VrBdrz3ofp65nb477sGHgmoQEtvfZnvU1CDeiIFLoTRDtJRom4RiTMzgyKw8lTmf0SFcI9wASAJTcgKdt8HRVl+kZ8Ero4zmQ==',key_name='tempest-TestNetworkBasicOps-1805593060',keypairs=<?>,launch_index=0,launched_at=2025-10-10T10:18:18Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d5e531d4b440422d946eaf6fd4e166f7',ramdisk_id='',reservation_id='r-0k9ji85m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5ae78700-970d-45b4-a57d-978a054c7519',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-188749107',owner_user_name='tempest-TestNetworkBasicOps-188749107-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-10T10:18:18Z,user_data=None,user_id='7956778c03764aaf8906c9b435337976',uuid=49a68fc9-f469-4827-9bb8-f2c2981d2b68,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "864e1646-5abd-4268-a80a-c224425c842d", "address": "fa:16:3e:19:de:db", "network": {"id": "f2187c16-3ad9-4fc6-892a-d36a6262d4d0", "bridge": "br-int", "label": "tempest-network-smoke--807297116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap864e1646-5a", "ovs_interfaceid": "864e1646-5abd-4268-a80a-c224425c842d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.339 2 DEBUG nova.network.os_vif_util [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converting VIF {"id": "864e1646-5abd-4268-a80a-c224425c842d", "address": "fa:16:3e:19:de:db", "network": {"id": "f2187c16-3ad9-4fc6-892a-d36a6262d4d0", "bridge": "br-int", "label": "tempest-network-smoke--807297116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap864e1646-5a", "ovs_interfaceid": "864e1646-5abd-4268-a80a-c224425c842d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.340 2 DEBUG nova.network.os_vif_util [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:de:db,bridge_name='br-int',has_traffic_filtering=True,id=864e1646-5abd-4268-a80a-c224425c842d,network=Network(f2187c16-3ad9-4fc6-892a-d36a6262d4d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap864e1646-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.340 2 DEBUG os_vif [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:de:db,bridge_name='br-int',has_traffic_filtering=True,id=864e1646-5abd-4268-a80a-c224425c842d,network=Network(f2187c16-3ad9-4fc6-892a-d36a6262d4d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap864e1646-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.342 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap864e1646-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.346 2 INFO os_vif [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:de:db,bridge_name='br-int',has_traffic_filtering=True,id=864e1646-5abd-4268-a80a-c224425c842d,network=Network(f2187c16-3ad9-4fc6-892a-d36a6262d4d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap864e1646-5a')#033[00m
Oct 10 06:18:23 np0005479823 podman[245613]: 2025-10-10 10:18:23.356868011 +0000 UTC m=+0.044469025 container remove 0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct 10 06:18:23 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.362 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[41584cba-21c7-4ac6-aedc-e82787b8001e]: (4, ('Fri Oct 10 10:18:23 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0 (0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca)\n0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca\nFri Oct 10 10:18:23 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0 (0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca)\n0a6fa5a7bdf7f2e3a93eee8c17ed7bc812493ca797de0729ea978e83ba81c5ca\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:18:23 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.364 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[34c7922b-abdf-4f2e-bcc1-ce881e3f60c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:18:23 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.365 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf2187c16-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:23 np0005479823 kernel: tapf2187c16-30: left promiscuous mode
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:23 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.371 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[f30d7ee1-fc79-43a9-be11-d84b11c523a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:23 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.401 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[1565ba04-aec2-4bde-b5a0-e75e2260d018]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:18:23 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.402 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[e3f51322-4f75-4f64-ae68-0934207d6ed5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:18:23 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.415 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[b44148eb-32db-4806-a959-a398203125f4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439943, 'reachable_time': 38988, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245652, 'error': None, 'target': 'ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:18:23 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.417 141908 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f2187c16-3ad9-4fc6-892a-d36a6262d4d0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct 10 06:18:23 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:23.417 141908 DEBUG oslo.privsep.daemon [-] privsep: reply[b2753098-17f6-4f0a-a8bf-5fabe076e43c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:18:23 np0005479823 systemd[1]: run-netns-ovnmeta\x2df2187c16\x2d3ad9\x2d4fc6\x2d892a\x2dd36a6262d4d0.mount: Deactivated successfully.
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.439 2 DEBUG nova.compute.manager [req-fb7d6de3-ce71-4f88-b1c0-fb9124a81712 req-4411a871-8219-4d71-a34c-5b73af702555 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Received event network-vif-unplugged-864e1646-5abd-4268-a80a-c224425c842d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.440 2 DEBUG oslo_concurrency.lockutils [req-fb7d6de3-ce71-4f88-b1c0-fb9124a81712 req-4411a871-8219-4d71-a34c-5b73af702555 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.441 2 DEBUG oslo_concurrency.lockutils [req-fb7d6de3-ce71-4f88-b1c0-fb9124a81712 req-4411a871-8219-4d71-a34c-5b73af702555 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.441 2 DEBUG oslo_concurrency.lockutils [req-fb7d6de3-ce71-4f88-b1c0-fb9124a81712 req-4411a871-8219-4d71-a34c-5b73af702555 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.441 2 DEBUG nova.compute.manager [req-fb7d6de3-ce71-4f88-b1c0-fb9124a81712 req-4411a871-8219-4d71-a34c-5b73af702555 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] No waiting events found dispatching network-vif-unplugged-864e1646-5abd-4268-a80a-c224425c842d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.442 2 DEBUG nova.compute.manager [req-fb7d6de3-ce71-4f88-b1c0-fb9124a81712 req-4411a871-8219-4d71-a34c-5b73af702555 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Received event network-vif-unplugged-864e1646-5abd-4268-a80a-c224425c842d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.698 2 INFO nova.virt.libvirt.driver [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Deleting instance files /var/lib/nova/instances/49a68fc9-f469-4827-9bb8-f2c2981d2b68_del#033[00m
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.700 2 INFO nova.virt.libvirt.driver [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Deletion of /var/lib/nova/instances/49a68fc9-f469-4827-9bb8-f2c2981d2b68_del complete#033[00m
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.753 2 INFO nova.compute.manager [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Took 0.67 seconds to destroy the instance on the hypervisor.#033[00m
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.754 2 DEBUG oslo.service.loopingcall [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.755 2 DEBUG nova.compute.manager [-] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct 10 06:18:23 np0005479823 nova_compute[235775]: 2025-10-10 10:18:23.755 2 DEBUG nova.network.neutron [-] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct 10 06:18:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:24.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:25.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:25 np0005479823 nova_compute[235775]: 2025-10-10 10:18:25.152 2 DEBUG nova.network.neutron [-] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct 10 06:18:25 np0005479823 nova_compute[235775]: 2025-10-10 10:18:25.170 2 INFO nova.compute.manager [-] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Took 1.41 seconds to deallocate network for instance.#033[00m
Oct 10 06:18:25 np0005479823 nova_compute[235775]: 2025-10-10 10:18:25.221 2 DEBUG oslo_concurrency.lockutils [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:18:25 np0005479823 nova_compute[235775]: 2025-10-10 10:18:25.222 2 DEBUG oslo_concurrency.lockutils [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:18:25 np0005479823 nova_compute[235775]: 2025-10-10 10:18:25.303 2 DEBUG oslo_concurrency.processutils [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:18:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:18:25 np0005479823 nova_compute[235775]: 2025-10-10 10:18:25.542 2 DEBUG nova.compute.manager [req-f38f91dd-8c86-4e98-ada8-4067305712c4 req-739e08a3-90f5-4bdb-ac1e-e82c69128054 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Received event network-vif-plugged-864e1646-5abd-4268-a80a-c224425c842d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct 10 06:18:25 np0005479823 nova_compute[235775]: 2025-10-10 10:18:25.543 2 DEBUG oslo_concurrency.lockutils [req-f38f91dd-8c86-4e98-ada8-4067305712c4 req-739e08a3-90f5-4bdb-ac1e-e82c69128054 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:18:25 np0005479823 nova_compute[235775]: 2025-10-10 10:18:25.543 2 DEBUG oslo_concurrency.lockutils [req-f38f91dd-8c86-4e98-ada8-4067305712c4 req-739e08a3-90f5-4bdb-ac1e-e82c69128054 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:18:25 np0005479823 nova_compute[235775]: 2025-10-10 10:18:25.544 2 DEBUG oslo_concurrency.lockutils [req-f38f91dd-8c86-4e98-ada8-4067305712c4 req-739e08a3-90f5-4bdb-ac1e-e82c69128054 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:18:25 np0005479823 nova_compute[235775]: 2025-10-10 10:18:25.544 2 DEBUG nova.compute.manager [req-f38f91dd-8c86-4e98-ada8-4067305712c4 req-739e08a3-90f5-4bdb-ac1e-e82c69128054 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] No waiting events found dispatching network-vif-plugged-864e1646-5abd-4268-a80a-c224425c842d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct 10 06:18:25 np0005479823 nova_compute[235775]: 2025-10-10 10:18:25.544 2 WARNING nova.compute.manager [req-f38f91dd-8c86-4e98-ada8-4067305712c4 req-739e08a3-90f5-4bdb-ac1e-e82c69128054 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Received unexpected event network-vif-plugged-864e1646-5abd-4268-a80a-c224425c842d for instance with vm_state deleted and task_state None.#033[00m
Oct 10 06:18:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:18:25 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1600136722' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:18:25 np0005479823 nova_compute[235775]: 2025-10-10 10:18:25.787 2 DEBUG oslo_concurrency.processutils [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:18:25 np0005479823 nova_compute[235775]: 2025-10-10 10:18:25.795 2 DEBUG nova.compute.provider_tree [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 10 06:18:25 np0005479823 nova_compute[235775]: 2025-10-10 10:18:25.816 2 DEBUG nova.scheduler.client.report [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 10 06:18:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:25 np0005479823 nova_compute[235775]: 2025-10-10 10:18:25.844 2 DEBUG oslo_concurrency.lockutils [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.623s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:18:25 np0005479823 nova_compute[235775]: 2025-10-10 10:18:25.893 2 INFO nova.scheduler.client.report [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Deleted allocations for instance 49a68fc9-f469-4827-9bb8-f2c2981d2b68#033[00m
Oct 10 06:18:25 np0005479823 nova_compute[235775]: 2025-10-10 10:18:25.970 2 DEBUG oslo_concurrency.lockutils [None req-93dc4d03-f063-44ce-9ef9-04bbaa4fd463 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "49a68fc9-f469-4827-9bb8-f2c2981d2b68" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.896s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:18:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:18:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:18:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:18:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:26 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:18:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:18:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:26.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:18:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 10 06:18:26 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1965479803' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 06:18:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 10 06:18:26 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1965479803' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 06:18:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:27.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:27 np0005479823 nova_compute[235775]: 2025-10-10 10:18:27.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:28 np0005479823 nova_compute[235775]: 2025-10-10 10:18:28.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:28.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:18:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:29.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:18:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:18:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:18:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:30.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:18:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:18:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:18:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:18:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:31 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:18:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:18:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:31.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:18:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:32.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:32 np0005479823 nova_compute[235775]: 2025-10-10 10:18:32.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:32 np0005479823 nova_compute[235775]: 2025-10-10 10:18:32.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:32 np0005479823 nova_compute[235775]: 2025-10-10 10:18:32.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:33.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:33 np0005479823 nova_compute[235775]: 2025-10-10 10:18:33.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:33 np0005479823 podman[245690]: 2025-10-10 10:18:33.804687511 +0000 UTC m=+0.063243517 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 10 06:18:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:33 np0005479823 podman[245688]: 2025-10-10 10:18:33.836410106 +0000 UTC m=+0.098734342 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 10 06:18:33 np0005479823 podman[245689]: 2025-10-10 10:18:33.839055872 +0000 UTC m=+0.095834780 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct 10 06:18:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:34.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:35.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:18:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:18:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:18:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:18:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:36 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:18:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:18:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:36.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:18:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:18:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:37.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:18:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:37 np0005479823 nova_compute[235775]: 2025-10-10 10:18:37.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:38 np0005479823 nova_compute[235775]: 2025-10-10 10:18:38.313 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1760091503.3117325, 49a68fc9-f469-4827-9bb8-f2c2981d2b68 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct 10 06:18:38 np0005479823 nova_compute[235775]: 2025-10-10 10:18:38.314 2 INFO nova.compute.manager [-] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] VM Stopped (Lifecycle Event)#033[00m
Oct 10 06:18:38 np0005479823 nova_compute[235775]: 2025-10-10 10:18:38.341 2 DEBUG nova.compute.manager [None req-c5ab02e4-6a6d-4654-861d-36fca741e53c - - - - - -] [instance: 49a68fc9-f469-4827-9bb8-f2c2981d2b68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct 10 06:18:38 np0005479823 nova_compute[235775]: 2025-10-10 10:18:38.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:38.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:39.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:18:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:40.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:18:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:18:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:18:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:41 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:18:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:41.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:41.473 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:18:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:41.474 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:18:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:18:41.474 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:18:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:18:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:42.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:18:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:42 np0005479823 nova_compute[235775]: 2025-10-10 10:18:42.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:43.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:43 np0005479823 nova_compute[235775]: 2025-10-10 10:18:43.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:44.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:44 np0005479823 podman[245791]: 2025-10-10 10:18:44.774125874 +0000 UTC m=+0.049686393 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 06:18:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:18:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:45.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:18:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:18:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:18:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:18:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:18:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:46 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:18:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:46.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:47.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:47 np0005479823 nova_compute[235775]: 2025-10-10 10:18:47.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:48 np0005479823 nova_compute[235775]: 2025-10-10 10:18:48.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:48.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:18:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:49.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:18:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:18:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:18:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:50.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:18:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:18:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:18:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:18:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:51 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:18:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:51.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:52.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:52 np0005479823 nova_compute[235775]: 2025-10-10 10:18:52.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:18:52 np0005479823 nova_compute[235775]: 2025-10-10 10:18:52.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:18:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:52 np0005479823 nova_compute[235775]: 2025-10-10 10:18:52.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:53.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:53 np0005479823 nova_compute[235775]: 2025-10-10 10:18:53.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:54.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:55.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:18:55 np0005479823 nova_compute[235775]: 2025-10-10 10:18:55.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:18:55 np0005479823 nova_compute[235775]: 2025-10-10 10:18:55.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 10 06:18:55 np0005479823 nova_compute[235775]: 2025-10-10 10:18:55.816 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 10 06:18:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:55 np0005479823 nova_compute[235775]: 2025-10-10 10:18:55.844 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct 10 06:18:55 np0005479823 nova_compute[235775]: 2025-10-10 10:18:55.844 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:18:55 np0005479823 nova_compute[235775]: 2025-10-10 10:18:55.845 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:18:55 np0005479823 nova_compute[235775]: 2025-10-10 10:18:55.875 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:18:55 np0005479823 nova_compute[235775]: 2025-10-10 10:18:55.875 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:18:55 np0005479823 nova_compute[235775]: 2025-10-10 10:18:55.876 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:18:55 np0005479823 nova_compute[235775]: 2025-10-10 10:18:55.876 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 10 06:18:55 np0005479823 nova_compute[235775]: 2025-10-10 10:18:55.876 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:18:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:18:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:18:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:18:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:18:56 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:18:56 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:18:56 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/475021902' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:18:56 np0005479823 nova_compute[235775]: 2025-10-10 10:18:56.337 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:18:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:18:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:56.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:18:56 np0005479823 nova_compute[235775]: 2025-10-10 10:18:56.542 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:18:56 np0005479823 nova_compute[235775]: 2025-10-10 10:18:56.544 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4900MB free_disk=59.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 10 06:18:56 np0005479823 nova_compute[235775]: 2025-10-10 10:18:56.544 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:18:56 np0005479823 nova_compute[235775]: 2025-10-10 10:18:56.545 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:18:56 np0005479823 nova_compute[235775]: 2025-10-10 10:18:56.627 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 10 06:18:56 np0005479823 nova_compute[235775]: 2025-10-10 10:18:56.628 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 10 06:18:56 np0005479823 nova_compute[235775]: 2025-10-10 10:18:56.644 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:18:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:57 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:18:57 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3962479693' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:18:57 np0005479823 nova_compute[235775]: 2025-10-10 10:18:57.073 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:18:57 np0005479823 nova_compute[235775]: 2025-10-10 10:18:57.078 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 10 06:18:57 np0005479823 nova_compute[235775]: 2025-10-10 10:18:57.105 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 10 06:18:57 np0005479823 nova_compute[235775]: 2025-10-10 10:18:57.129 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct 10 06:18:57 np0005479823 nova_compute[235775]: 2025-10-10 10:18:57.129 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.585s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:18:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:57.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:57 np0005479823 nova_compute[235775]: 2025-10-10 10:18:57.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:58 np0005479823 nova_compute[235775]: 2025-10-10 10:18:58.099 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:18:58 np0005479823 nova_compute[235775]: 2025-10-10 10:18:58.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:18:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:18:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:18:58.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:18:58 np0005479823 nova_compute[235775]: 2025-10-10 10:18:58.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:18:58 np0005479823 nova_compute[235775]: 2025-10-10 10:18:58.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:18:58 np0005479823 nova_compute[235775]: 2025-10-10 10:18:58.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct 10 06:18:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:18:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:18:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:18:59.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:18:59 np0005479823 nova_compute[235775]: 2025-10-10 10:18:59.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 06:18:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:18:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:18:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:18:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:19:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:00.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:19:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:19:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:19:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:01 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:19:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:19:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:01.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:19:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:02.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:02 np0005479823 nova_compute[235775]: 2025-10-10 10:19:02.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:19:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:19:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:03.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:19:03 np0005479823 nova_compute[235775]: 2025-10-10 10:19:03.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:19:03 np0005479823 podman[246021]: 2025-10-10 10:19:03.720351498 +0000 UTC m=+0.053309608 container exec bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325)
Oct 10 06:19:03 np0005479823 podman[246021]: 2025-10-10 10:19:03.812186919 +0000 UTC m=+0.145145009 container exec_died bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, ceph=True)
Oct 10 06:19:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:03 np0005479823 podman[246055]: 2025-10-10 10:19:03.942642906 +0000 UTC m=+0.060530570 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 10 06:19:03 np0005479823 podman[246058]: 2025-10-10 10:19:03.954734123 +0000 UTC m=+0.075423186 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 10 06:19:04 np0005479823 podman[246060]: 2025-10-10 10:19:04.009962252 +0000 UTC m=+0.129763776 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 06:19:04 np0005479823 podman[246203]: 2025-10-10 10:19:04.238444407 +0000 UTC m=+0.052002606 container exec 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 06:19:04 np0005479823 podman[246203]: 2025-10-10 10:19:04.269127899 +0000 UTC m=+0.082686098 container exec_died 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 06:19:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:04.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:04 np0005479823 podman[246293]: 2025-10-10 10:19:04.577951098 +0000 UTC m=+0.052517944 container exec eac346131ad153d129d5755e1377a2007627c03598a265a99b9e06d18355c13f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1)
Oct 10 06:19:04 np0005479823 podman[246293]: 2025-10-10 10:19:04.58615481 +0000 UTC m=+0.060721656 container exec_died eac346131ad153d129d5755e1377a2007627c03598a265a99b9e06d18355c13f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Oct 10 06:19:04 np0005479823 podman[246357]: 2025-10-10 10:19:04.765348758 +0000 UTC m=+0.047490922 container exec 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 06:19:04 np0005479823 podman[246357]: 2025-10-10 10:19:04.774104278 +0000 UTC m=+0.056246342 container exec_died 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 06:19:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:04 np0005479823 podman[246423]: 2025-10-10 10:19:04.994547286 +0000 UTC m=+0.071845421 container exec 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, architecture=x86_64, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=, name=keepalived, release=1793, version=2.2.4, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Oct 10 06:19:05 np0005479823 podman[246423]: 2025-10-10 10:19:05.01531154 +0000 UTC m=+0.092609655 container exec_died 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, release=1793, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, distribution-scope=public, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, architecture=x86_64, name=keepalived, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git)
Oct 10 06:19:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:19:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:05.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:19:05 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:19:05 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:19:05 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Oct 10 06:19:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:19:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:19:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:19:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:19:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:06 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:19:06 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:19:06 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:19:06 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:19:06 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:19:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:06.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:07.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:07 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:19:07 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:19:07 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Oct 10 06:19:07 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:19:07 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:19:07 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Oct 10 06:19:07 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:19:07 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:19:07 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:19:07 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:19:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:07 np0005479823 nova_compute[235775]: 2025-10-10 10:19:07.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:19:07 np0005479823 ovn_controller[132503]: 2025-10-10T10:19:07Z|00046|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct 10 06:19:08 np0005479823 nova_compute[235775]: 2025-10-10 10:19:08.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:19:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:08.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:09.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:19:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:10.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:19:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:19:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:19:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:11 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:19:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:11.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:12 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:19:12 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:19:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:12.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:12 np0005479823 nova_compute[235775]: 2025-10-10 10:19:12.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:13.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:13 np0005479823 nova_compute[235775]: 2025-10-10 10:19:13.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:14.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:15.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:19:15 np0005479823 podman[246680]: 2025-10-10 10:19:15.814778919 +0000 UTC m=+0.073590448 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 06:19:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:19:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:19:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:19:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:16 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:19:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:16.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:19:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:17.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:19:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:17 np0005479823 nova_compute[235775]: 2025-10-10 10:19:17.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:18 np0005479823 nova_compute[235775]: 2025-10-10 10:19:18.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:19:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:18.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:19:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:19:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:19.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:19:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:19:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:19:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:20.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:19:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:19:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:19:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:19:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:21 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:19:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:21.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:22.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:22 np0005479823 nova_compute[235775]: 2025-10-10 10:19:22.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:22 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:22.533 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct 10 06:19:22 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:22.536 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct 10 06:19:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:22 np0005479823 nova_compute[235775]: 2025-10-10 10:19:22.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000065s ======
Oct 10 06:19:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:23.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Oct 10 06:19:23 np0005479823 nova_compute[235775]: 2025-10-10 10:19:23.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:24.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:25.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:19:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:19:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:19:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:19:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:26 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:19:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:26.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:27.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:27 np0005479823 nova_compute[235775]: 2025-10-10 10:19:27.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:28 np0005479823 nova_compute[235775]: 2025-10-10 10:19:28.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:28.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:28 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:28.538 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:19:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:19:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:29.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:19:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:19:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:30.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:19:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:19:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:19:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:31 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:19:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:19:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:31.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:19:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:19:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:32.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:19:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:32 np0005479823 nova_compute[235775]: 2025-10-10 10:19:32.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:19:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:33.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:19:33 np0005479823 nova_compute[235775]: 2025-10-10 10:19:33.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:34.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:34 np0005479823 podman[246745]: 2025-10-10 10:19:34.817279583 +0000 UTC m=+0.085274471 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 10 06:19:34 np0005479823 podman[246746]: 2025-10-10 10:19:34.821999994 +0000 UTC m=+0.088443293 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid)
Oct 10 06:19:34 np0005479823 podman[246744]: 2025-10-10 10:19:34.823703449 +0000 UTC m=+0.095053005 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 10 06:19:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:35.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:19:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:36 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:19:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:36 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:19:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:36 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:19:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:36 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:19:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:36.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:37.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:37 np0005479823 nova_compute[235775]: 2025-10-10 10:19:37.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:38 np0005479823 nova_compute[235775]: 2025-10-10 10:19:38.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:38.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:19:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:39.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:19:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:19:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:40.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:19:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:19:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:19:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:41 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:19:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:41.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:41.474 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:19:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:41.474 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:19:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:41.474 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:19:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:42.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:42 np0005479823 nova_compute[235775]: 2025-10-10 10:19:42.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:43.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:43 np0005479823 nova_compute[235775]: 2025-10-10 10:19:43.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:19:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:44.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:19:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:45.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:19:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:19:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:19:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:19:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:46 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:19:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:46.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:46 np0005479823 nova_compute[235775]: 2025-10-10 10:19:46.770 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "4fd38b02-f79c-4eb5-9939-6939dda28a15" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:19:46 np0005479823 nova_compute[235775]: 2025-10-10 10:19:46.770 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:19:46 np0005479823 podman[246841]: 2025-10-10 10:19:46.783635366 +0000 UTC m=+0.056637975 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 10 06:19:46 np0005479823 nova_compute[235775]: 2025-10-10 10:19:46.787 2 DEBUG nova.compute.manager [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct 10 06:19:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:46 np0005479823 nova_compute[235775]: 2025-10-10 10:19:46.869 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:19:46 np0005479823 nova_compute[235775]: 2025-10-10 10:19:46.870 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:19:46 np0005479823 nova_compute[235775]: 2025-10-10 10:19:46.876 2 DEBUG nova.virt.hardware [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct 10 06:19:46 np0005479823 nova_compute[235775]: 2025-10-10 10:19:46.877 2 INFO nova.compute.claims [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct 10 06:19:46 np0005479823 nova_compute[235775]: 2025-10-10 10:19:46.973 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:19:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:47.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:47 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:19:47 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2667140307' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:19:47 np0005479823 nova_compute[235775]: 2025-10-10 10:19:47.410 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:19:47 np0005479823 nova_compute[235775]: 2025-10-10 10:19:47.416 2 DEBUG nova.compute.provider_tree [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 10 06:19:47 np0005479823 nova_compute[235775]: 2025-10-10 10:19:47.436 2 DEBUG nova.scheduler.client.report [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 10 06:19:47 np0005479823 nova_compute[235775]: 2025-10-10 10:19:47.474 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.604s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:19:47 np0005479823 nova_compute[235775]: 2025-10-10 10:19:47.474 2 DEBUG nova.compute.manager [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct 10 06:19:47 np0005479823 nova_compute[235775]: 2025-10-10 10:19:47.549 2 DEBUG nova.compute.manager [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct 10 06:19:47 np0005479823 nova_compute[235775]: 2025-10-10 10:19:47.550 2 DEBUG nova.network.neutron [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct 10 06:19:47 np0005479823 nova_compute[235775]: 2025-10-10 10:19:47.579 2 INFO nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct 10 06:19:47 np0005479823 nova_compute[235775]: 2025-10-10 10:19:47.605 2 DEBUG nova.compute.manager [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct 10 06:19:47 np0005479823 nova_compute[235775]: 2025-10-10 10:19:47.734 2 DEBUG nova.compute.manager [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct 10 06:19:47 np0005479823 nova_compute[235775]: 2025-10-10 10:19:47.735 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct 10 06:19:47 np0005479823 nova_compute[235775]: 2025-10-10 10:19:47.736 2 INFO nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Creating image(s)#033[00m
Oct 10 06:19:47 np0005479823 nova_compute[235775]: 2025-10-10 10:19:47.757 2 DEBUG nova.storage.rbd_utils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 4fd38b02-f79c-4eb5-9939-6939dda28a15_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct 10 06:19:47 np0005479823 nova_compute[235775]: 2025-10-10 10:19:47.785 2 DEBUG nova.storage.rbd_utils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 4fd38b02-f79c-4eb5-9939-6939dda28a15_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct 10 06:19:47 np0005479823 nova_compute[235775]: 2025-10-10 10:19:47.809 2 DEBUG nova.storage.rbd_utils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 4fd38b02-f79c-4eb5-9939-6939dda28a15_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct 10 06:19:47 np0005479823 nova_compute[235775]: 2025-10-10 10:19:47.813 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:19:47 np0005479823 nova_compute[235775]: 2025-10-10 10:19:47.838 2 DEBUG nova.policy [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7956778c03764aaf8906c9b435337976', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd5e531d4b440422d946eaf6fd4e166f7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct 10 06:19:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:47 np0005479823 nova_compute[235775]: 2025-10-10 10:19:47.885 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:19:47 np0005479823 nova_compute[235775]: 2025-10-10 10:19:47.885 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "eec5fe2328f977d3b1a385313e521aef425c0ac1" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:19:47 np0005479823 nova_compute[235775]: 2025-10-10 10:19:47.886 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "eec5fe2328f977d3b1a385313e521aef425c0ac1" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:19:47 np0005479823 nova_compute[235775]: 2025-10-10 10:19:47.886 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "eec5fe2328f977d3b1a385313e521aef425c0ac1" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:19:47 np0005479823 nova_compute[235775]: 2025-10-10 10:19:47.908 2 DEBUG nova.storage.rbd_utils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 4fd38b02-f79c-4eb5-9939-6939dda28a15_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct 10 06:19:47 np0005479823 nova_compute[235775]: 2025-10-10 10:19:47.912 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1 4fd38b02-f79c-4eb5-9939-6939dda28a15_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:19:47 np0005479823 nova_compute[235775]: 2025-10-10 10:19:47.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:48 np0005479823 nova_compute[235775]: 2025-10-10 10:19:48.132 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/eec5fe2328f977d3b1a385313e521aef425c0ac1 4fd38b02-f79c-4eb5-9939-6939dda28a15_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.220s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:19:48 np0005479823 nova_compute[235775]: 2025-10-10 10:19:48.188 2 DEBUG nova.storage.rbd_utils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] resizing rbd image 4fd38b02-f79c-4eb5-9939-6939dda28a15_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Oct 10 06:19:48 np0005479823 nova_compute[235775]: 2025-10-10 10:19:48.276 2 DEBUG nova.objects.instance [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lazy-loading 'migration_context' on Instance uuid 4fd38b02-f79c-4eb5-9939-6939dda28a15 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct 10 06:19:48 np0005479823 nova_compute[235775]: 2025-10-10 10:19:48.289 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct 10 06:19:48 np0005479823 nova_compute[235775]: 2025-10-10 10:19:48.290 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Ensure instance console log exists: /var/lib/nova/instances/4fd38b02-f79c-4eb5-9939-6939dda28a15/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct 10 06:19:48 np0005479823 nova_compute[235775]: 2025-10-10 10:19:48.290 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:19:48 np0005479823 nova_compute[235775]: 2025-10-10 10:19:48.290 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:19:48 np0005479823 nova_compute[235775]: 2025-10-10 10:19:48.290 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:19:48 np0005479823 nova_compute[235775]: 2025-10-10 10:19:48.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:48 np0005479823 nova_compute[235775]: 2025-10-10 10:19:48.431 2 DEBUG nova.network.neutron [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Successfully created port: 7369f952-1f44-445c-9449-347d6d476d79 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct 10 06:19:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:48.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:49.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:49 np0005479823 nova_compute[235775]: 2025-10-10 10:19:49.357 2 DEBUG nova.network.neutron [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Successfully updated port: 7369f952-1f44-445c-9449-347d6d476d79 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct 10 06:19:49 np0005479823 nova_compute[235775]: 2025-10-10 10:19:49.378 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct 10 06:19:49 np0005479823 nova_compute[235775]: 2025-10-10 10:19:49.379 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquired lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct 10 06:19:49 np0005479823 nova_compute[235775]: 2025-10-10 10:19:49.379 2 DEBUG nova.network.neutron [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct 10 06:19:49 np0005479823 nova_compute[235775]: 2025-10-10 10:19:49.438 2 DEBUG nova.compute.manager [req-f5013e3c-028b-47f1-ae96-76b904aaef00 req-5900f698-2a42-4adf-84e6-02fc3eb46069 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-changed-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct 10 06:19:49 np0005479823 nova_compute[235775]: 2025-10-10 10:19:49.438 2 DEBUG nova.compute.manager [req-f5013e3c-028b-47f1-ae96-76b904aaef00 req-5900f698-2a42-4adf-84e6-02fc3eb46069 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Refreshing instance network info cache due to event network-changed-7369f952-1f44-445c-9449-347d6d476d79. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct 10 06:19:49 np0005479823 nova_compute[235775]: 2025-10-10 10:19:49.439 2 DEBUG oslo_concurrency.lockutils [req-f5013e3c-028b-47f1-ae96-76b904aaef00 req-5900f698-2a42-4adf-84e6-02fc3eb46069 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct 10 06:19:49 np0005479823 nova_compute[235775]: 2025-10-10 10:19:49.533 2 DEBUG nova.network.neutron [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct 10 06:19:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:19:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:50.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:50 np0005479823 nova_compute[235775]: 2025-10-10 10:19:50.592 2 DEBUG nova.network.neutron [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updating instance_info_cache with network_info: [{"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct 10 06:19:50 np0005479823 nova_compute[235775]: 2025-10-10 10:19:50.615 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Releasing lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct 10 06:19:50 np0005479823 nova_compute[235775]: 2025-10-10 10:19:50.615 2 DEBUG nova.compute.manager [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Instance network_info: |[{"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct 10 06:19:50 np0005479823 nova_compute[235775]: 2025-10-10 10:19:50.616 2 DEBUG oslo_concurrency.lockutils [req-f5013e3c-028b-47f1-ae96-76b904aaef00 req-5900f698-2a42-4adf-84e6-02fc3eb46069 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquired lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct 10 06:19:50 np0005479823 nova_compute[235775]: 2025-10-10 10:19:50.617 2 DEBUG nova.network.neutron [req-f5013e3c-028b-47f1-ae96-76b904aaef00 req-5900f698-2a42-4adf-84e6-02fc3eb46069 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Refreshing network info cache for port 7369f952-1f44-445c-9449-347d6d476d79 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct 10 06:19:50 np0005479823 nova_compute[235775]: 2025-10-10 10:19:50.622 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Start _get_guest_xml network_info=[{"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-10T10:09:50Z,direct_url=<?>,disk_format='qcow2',id=5ae78700-970d-45b4-a57d-978a054c7519,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ec962e275689437d80680ff3ea69c852',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-10T10:09:53Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'size': 0, 'device_type': 'disk', 'guest_format': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_secret_uuid': None, 'encrypted': False, 'encryption_format': None, 'encryption_options': None, 'image_id': '5ae78700-970d-45b4-a57d-978a054c7519'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct 10 06:19:50 np0005479823 nova_compute[235775]: 2025-10-10 10:19:50.628 2 WARNING nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:19:50 np0005479823 nova_compute[235775]: 2025-10-10 10:19:50.633 2 DEBUG nova.virt.libvirt.host [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct 10 06:19:50 np0005479823 nova_compute[235775]: 2025-10-10 10:19:50.634 2 DEBUG nova.virt.libvirt.host [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct 10 06:19:50 np0005479823 nova_compute[235775]: 2025-10-10 10:19:50.648 2 DEBUG nova.virt.libvirt.host [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct 10 06:19:50 np0005479823 nova_compute[235775]: 2025-10-10 10:19:50.649 2 DEBUG nova.virt.libvirt.host [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct 10 06:19:50 np0005479823 nova_compute[235775]: 2025-10-10 10:19:50.649 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct 10 06:19:50 np0005479823 nova_compute[235775]: 2025-10-10 10:19:50.650 2 DEBUG nova.virt.hardware [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-10T10:09:48Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='00373e71-6208-4238-ad85-db0452c53bc6',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-10T10:09:50Z,direct_url=<?>,disk_format='qcow2',id=5ae78700-970d-45b4-a57d-978a054c7519,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ec962e275689437d80680ff3ea69c852',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-10T10:09:53Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct 10 06:19:50 np0005479823 nova_compute[235775]: 2025-10-10 10:19:50.651 2 DEBUG nova.virt.hardware [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct 10 06:19:50 np0005479823 nova_compute[235775]: 2025-10-10 10:19:50.651 2 DEBUG nova.virt.hardware [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct 10 06:19:50 np0005479823 nova_compute[235775]: 2025-10-10 10:19:50.652 2 DEBUG nova.virt.hardware [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct 10 06:19:50 np0005479823 nova_compute[235775]: 2025-10-10 10:19:50.652 2 DEBUG nova.virt.hardware [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct 10 06:19:50 np0005479823 nova_compute[235775]: 2025-10-10 10:19:50.653 2 DEBUG nova.virt.hardware [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct 10 06:19:50 np0005479823 nova_compute[235775]: 2025-10-10 10:19:50.653 2 DEBUG nova.virt.hardware [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct 10 06:19:50 np0005479823 nova_compute[235775]: 2025-10-10 10:19:50.654 2 DEBUG nova.virt.hardware [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct 10 06:19:50 np0005479823 nova_compute[235775]: 2025-10-10 10:19:50.654 2 DEBUG nova.virt.hardware [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct 10 06:19:50 np0005479823 nova_compute[235775]: 2025-10-10 10:19:50.655 2 DEBUG nova.virt.hardware [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct 10 06:19:50 np0005479823 nova_compute[235775]: 2025-10-10 10:19:50.655 2 DEBUG nova.virt.hardware [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct 10 06:19:50 np0005479823 nova_compute[235775]: 2025-10-10 10:19:50.659 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:19:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:19:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:19:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:19:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:51 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:19:51 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Oct 10 06:19:51 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4063882823' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.148 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.176 2 DEBUG nova.storage.rbd_utils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 4fd38b02-f79c-4eb5-9939-6939dda28a15_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.180 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:19:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:51.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:51 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Oct 10 06:19:51 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1076408673' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.618 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.619 2 DEBUG nova.virt.libvirt.vif [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-10T10:19:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-900143833',display_name='tempest-TestNetworkBasicOps-server-900143833',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-900143833',id=11,image_ref='5ae78700-970d-45b4-a57d-978a054c7519',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJQ6vYp+U8d7Yiink0K/iQNUjrLla5VjGnuqrTVtw+u6eTZg4qjU5w1TFNoLgk+EE3EJPtqEojXIPj0UMRCIST/kkZjRsWCJV3t0ho4U419OoM2lVk7/JJmPOAXOx5ZoVg==',key_name='tempest-TestNetworkBasicOps-780402283',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d5e531d4b440422d946eaf6fd4e166f7',ramdisk_id='',reservation_id='r-v13j2ta3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5ae78700-970d-45b4-a57d-978a054c7519',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-188749107',owner_user_name='tempest-TestNetworkBasicOps-188749107-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-10T10:19:47Z,user_data=None,user_id='7956778c03764aaf8906c9b435337976',uuid=4fd38b02-f79c-4eb5-9939-6939dda28a15,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.619 2 DEBUG nova.network.os_vif_util [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converting VIF {"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.620 2 DEBUG nova.network.os_vif_util [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:86:3b,bridge_name='br-int',has_traffic_filtering=True,id=7369f952-1f44-445c-9449-347d6d476d79,network=Network(fb3e50c5-fe48-4113-87d7-4e11945ac752),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7369f952-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.621 2 DEBUG nova.objects.instance [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4fd38b02-f79c-4eb5-9939-6939dda28a15 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.641 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] End _get_guest_xml xml=<domain type="kvm">
Oct 10 06:19:51 np0005479823 nova_compute[235775]:  <uuid>4fd38b02-f79c-4eb5-9939-6939dda28a15</uuid>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:  <name>instance-0000000b</name>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:  <memory>131072</memory>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:  <vcpu>1</vcpu>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:  <metadata>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <nova:name>tempest-TestNetworkBasicOps-server-900143833</nova:name>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <nova:creationTime>2025-10-10 10:19:50</nova:creationTime>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <nova:flavor name="m1.nano">
Oct 10 06:19:51 np0005479823 nova_compute[235775]:        <nova:memory>128</nova:memory>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:        <nova:disk>1</nova:disk>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:        <nova:swap>0</nova:swap>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:        <nova:ephemeral>0</nova:ephemeral>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:        <nova:vcpus>1</nova:vcpus>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      </nova:flavor>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <nova:owner>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:        <nova:user uuid="7956778c03764aaf8906c9b435337976">tempest-TestNetworkBasicOps-188749107-project-member</nova:user>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:        <nova:project uuid="d5e531d4b440422d946eaf6fd4e166f7">tempest-TestNetworkBasicOps-188749107</nova:project>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      </nova:owner>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <nova:root type="image" uuid="5ae78700-970d-45b4-a57d-978a054c7519"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <nova:ports>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:        <nova:port uuid="7369f952-1f44-445c-9449-347d6d476d79">
Oct 10 06:19:51 np0005479823 nova_compute[235775]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:        </nova:port>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      </nova:ports>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    </nova:instance>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:  </metadata>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:  <sysinfo type="smbios">
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <system>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <entry name="manufacturer">RDO</entry>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <entry name="product">OpenStack Compute</entry>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <entry name="serial">4fd38b02-f79c-4eb5-9939-6939dda28a15</entry>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <entry name="uuid">4fd38b02-f79c-4eb5-9939-6939dda28a15</entry>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <entry name="family">Virtual Machine</entry>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    </system>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:  </sysinfo>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:  <os>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <type arch="x86_64" machine="q35">hvm</type>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <boot dev="hd"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <smbios mode="sysinfo"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:  </os>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:  <features>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <acpi/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <apic/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <vmcoreinfo/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:  </features>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:  <clock offset="utc">
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <timer name="pit" tickpolicy="delay"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <timer name="rtc" tickpolicy="catchup"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <timer name="hpet" present="no"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:  </clock>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:  <cpu mode="host-model" match="exact">
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <topology sockets="1" cores="1" threads="1"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:  </cpu>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:  <devices>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <disk type="network" device="disk">
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <driver type="raw" cache="none"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <source protocol="rbd" name="vms/4fd38b02-f79c-4eb5-9939-6939dda28a15_disk">
Oct 10 06:19:51 np0005479823 nova_compute[235775]:        <host name="192.168.122.100" port="6789"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:        <host name="192.168.122.102" port="6789"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:        <host name="192.168.122.101" port="6789"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      </source>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <auth username="openstack">
Oct 10 06:19:51 np0005479823 nova_compute[235775]:        <secret type="ceph" uuid="21f084a3-af34-5230-afe4-ea5cd24a55f4"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      </auth>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <target dev="vda" bus="virtio"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    </disk>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <disk type="network" device="cdrom">
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <driver type="raw" cache="none"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <source protocol="rbd" name="vms/4fd38b02-f79c-4eb5-9939-6939dda28a15_disk.config">
Oct 10 06:19:51 np0005479823 nova_compute[235775]:        <host name="192.168.122.100" port="6789"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:        <host name="192.168.122.102" port="6789"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:        <host name="192.168.122.101" port="6789"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      </source>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <auth username="openstack">
Oct 10 06:19:51 np0005479823 nova_compute[235775]:        <secret type="ceph" uuid="21f084a3-af34-5230-afe4-ea5cd24a55f4"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      </auth>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <target dev="sda" bus="sata"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    </disk>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <interface type="ethernet">
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <mac address="fa:16:3e:54:86:3b"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <model type="virtio"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <driver name="vhost" rx_queue_size="512"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <mtu size="1442"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <target dev="tap7369f952-1f"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    </interface>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <serial type="pty">
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <log file="/var/lib/nova/instances/4fd38b02-f79c-4eb5-9939-6939dda28a15/console.log" append="off"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    </serial>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <video>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <model type="virtio"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    </video>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <input type="tablet" bus="usb"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <rng model="virtio">
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <backend model="random">/dev/urandom</backend>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    </rng>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="pci" model="pcie-root-port"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <controller type="usb" index="0"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    <memballoon model="virtio">
Oct 10 06:19:51 np0005479823 nova_compute[235775]:      <stats period="10"/>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:    </memballoon>
Oct 10 06:19:51 np0005479823 nova_compute[235775]:  </devices>
Oct 10 06:19:51 np0005479823 nova_compute[235775]: </domain>
Oct 10 06:19:51 np0005479823 nova_compute[235775]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.643 2 DEBUG nova.compute.manager [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Preparing to wait for external event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.643 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.643 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.643 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.644 2 DEBUG nova.virt.libvirt.vif [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-10T10:19:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-900143833',display_name='tempest-TestNetworkBasicOps-server-900143833',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-900143833',id=11,image_ref='5ae78700-970d-45b4-a57d-978a054c7519',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJQ6vYp+U8d7Yiink0K/iQNUjrLla5VjGnuqrTVtw+u6eTZg4qjU5w1TFNoLgk+EE3EJPtqEojXIPj0UMRCIST/kkZjRsWCJV3t0ho4U419OoM2lVk7/JJmPOAXOx5ZoVg==',key_name='tempest-TestNetworkBasicOps-780402283',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d5e531d4b440422d946eaf6fd4e166f7',ramdisk_id='',reservation_id='r-v13j2ta3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5ae78700-970d-45b4-a57d-978a054c7519',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-188749107',owner_user_name='tempest-TestNetworkBasicOps-188749107-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-10T10:19:47Z,user_data=None,user_id='7956778c03764aaf8906c9b435337976',uuid=4fd38b02-f79c-4eb5-9939-6939dda28a15,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.644 2 DEBUG nova.network.os_vif_util [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converting VIF {"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.644 2 DEBUG nova.network.os_vif_util [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:86:3b,bridge_name='br-int',has_traffic_filtering=True,id=7369f952-1f44-445c-9449-347d6d476d79,network=Network(fb3e50c5-fe48-4113-87d7-4e11945ac752),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7369f952-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.645 2 DEBUG os_vif [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:86:3b,bridge_name='br-int',has_traffic_filtering=True,id=7369f952-1f44-445c-9449-347d6d476d79,network=Network(fb3e50c5-fe48-4113-87d7-4e11945ac752),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7369f952-1f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.646 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.646 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.649 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7369f952-1f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.650 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7369f952-1f, col_values=(('external_ids', {'iface-id': '7369f952-1f44-445c-9449-347d6d476d79', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:54:86:3b', 'vm-uuid': '4fd38b02-f79c-4eb5-9939-6939dda28a15'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:51 np0005479823 NetworkManager[44866]: <info>  [1760091591.6527] manager: (tap7369f952-1f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.658 2 INFO os_vif [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:86:3b,bridge_name='br-int',has_traffic_filtering=True,id=7369f952-1f44-445c-9449-347d6d476d79,network=Network(fb3e50c5-fe48-4113-87d7-4e11945ac752),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7369f952-1f')#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.704 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.705 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.705 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] No VIF found with MAC fa:16:3e:54:86:3b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.706 2 INFO nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Using config drive#033[00m
Oct 10 06:19:51 np0005479823 nova_compute[235775]: 2025-10-10 10:19:51.730 2 DEBUG nova.storage.rbd_utils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 4fd38b02-f79c-4eb5-9939-6939dda28a15_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct 10 06:19:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:52 np0005479823 nova_compute[235775]: 2025-10-10 10:19:52.129 2 INFO nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Creating config drive at /var/lib/nova/instances/4fd38b02-f79c-4eb5-9939-6939dda28a15/disk.config#033[00m
Oct 10 06:19:52 np0005479823 nova_compute[235775]: 2025-10-10 10:19:52.138 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4fd38b02-f79c-4eb5-9939-6939dda28a15/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwj6ye7zz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:19:52 np0005479823 nova_compute[235775]: 2025-10-10 10:19:52.265 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4fd38b02-f79c-4eb5-9939-6939dda28a15/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwj6ye7zz" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:19:52 np0005479823 nova_compute[235775]: 2025-10-10 10:19:52.308 2 DEBUG nova.storage.rbd_utils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] rbd image 4fd38b02-f79c-4eb5-9939-6939dda28a15_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Oct 10 06:19:52 np0005479823 nova_compute[235775]: 2025-10-10 10:19:52.314 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4fd38b02-f79c-4eb5-9939-6939dda28a15/disk.config 4fd38b02-f79c-4eb5-9939-6939dda28a15_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:19:52 np0005479823 nova_compute[235775]: 2025-10-10 10:19:52.431 2 DEBUG nova.network.neutron [req-f5013e3c-028b-47f1-ae96-76b904aaef00 req-5900f698-2a42-4adf-84e6-02fc3eb46069 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updated VIF entry in instance network info cache for port 7369f952-1f44-445c-9449-347d6d476d79. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct 10 06:19:52 np0005479823 nova_compute[235775]: 2025-10-10 10:19:52.434 2 DEBUG nova.network.neutron [req-f5013e3c-028b-47f1-ae96-76b904aaef00 req-5900f698-2a42-4adf-84e6-02fc3eb46069 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updating instance_info_cache with network_info: [{"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct 10 06:19:52 np0005479823 nova_compute[235775]: 2025-10-10 10:19:52.454 2 DEBUG oslo_concurrency.lockutils [req-f5013e3c-028b-47f1-ae96-76b904aaef00 req-5900f698-2a42-4adf-84e6-02fc3eb46069 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Releasing lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct 10 06:19:52 np0005479823 nova_compute[235775]: 2025-10-10 10:19:52.499 2 DEBUG oslo_concurrency.processutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4fd38b02-f79c-4eb5-9939-6939dda28a15/disk.config 4fd38b02-f79c-4eb5-9939-6939dda28a15_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.185s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:19:52 np0005479823 nova_compute[235775]: 2025-10-10 10:19:52.500 2 INFO nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Deleting local config drive /var/lib/nova/instances/4fd38b02-f79c-4eb5-9939-6939dda28a15/disk.config because it was imported into RBD.#033[00m
Oct 10 06:19:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:52.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:52 np0005479823 kernel: tap7369f952-1f: entered promiscuous mode
Oct 10 06:19:52 np0005479823 NetworkManager[44866]: <info>  [1760091592.5602] manager: (tap7369f952-1f): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Oct 10 06:19:52 np0005479823 nova_compute[235775]: 2025-10-10 10:19:52.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:52 np0005479823 ovn_controller[132503]: 2025-10-10T10:19:52Z|00047|binding|INFO|Claiming lport 7369f952-1f44-445c-9449-347d6d476d79 for this chassis.
Oct 10 06:19:52 np0005479823 ovn_controller[132503]: 2025-10-10T10:19:52Z|00048|binding|INFO|7369f952-1f44-445c-9449-347d6d476d79: Claiming fa:16:3e:54:86:3b 10.100.0.5
Oct 10 06:19:52 np0005479823 nova_compute[235775]: 2025-10-10 10:19:52.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.580 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:86:3b 10.100.0.5'], port_security=['fa:16:3e:54:86:3b 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '4fd38b02-f79c-4eb5-9939-6939dda28a15', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb3e50c5-fe48-4113-87d7-4e11945ac752', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5e531d4b440422d946eaf6fd4e166f7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '26e88f36-7c05-4376-877b-78cbbe604817', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc36a9e4-a12c-4b9d-8968-49f72bde3476, chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>], logical_port=7369f952-1f44-445c-9449-347d6d476d79) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.581 141795 INFO neutron.agent.ovn.metadata.agent [-] Port 7369f952-1f44-445c-9449-347d6d476d79 in datapath fb3e50c5-fe48-4113-87d7-4e11945ac752 bound to our chassis#033[00m
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.583 141795 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fb3e50c5-fe48-4113-87d7-4e11945ac752#033[00m
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.595 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[a0a5209d-f839-4c54-81c9-82e770fef56f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.596 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfb3e50c5-f1 in ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.599 241439 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfb3e50c5-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.599 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[b428f4ac-b291-4d0b-85b0-78afdce8601d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.600 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[40656614-5900-4c3b-8a63-141b3db6bc29]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:19:52 np0005479823 systemd-udevd[247191]: Network interface NamePolicy= disabled on kernel command line.
Oct 10 06:19:52 np0005479823 systemd-machined[192768]: New machine qemu-3-instance-0000000b.
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.617 141908 DEBUG oslo.privsep.daemon [-] privsep: reply[f19818bc-6889-4b75-a21a-5ccfa8fea535]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:19:52 np0005479823 NetworkManager[44866]: <info>  [1760091592.6319] device (tap7369f952-1f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 10 06:19:52 np0005479823 NetworkManager[44866]: <info>  [1760091592.6329] device (tap7369f952-1f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 10 06:19:52 np0005479823 systemd[1]: Started Virtual Machine qemu-3-instance-0000000b.
Oct 10 06:19:52 np0005479823 nova_compute[235775]: 2025-10-10 10:19:52.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.647 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[d41a17c2-2079-4b32-8fe8-6c553e1d97ea]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:19:52 np0005479823 ovn_controller[132503]: 2025-10-10T10:19:52Z|00049|binding|INFO|Setting lport 7369f952-1f44-445c-9449-347d6d476d79 ovn-installed in OVS
Oct 10 06:19:52 np0005479823 ovn_controller[132503]: 2025-10-10T10:19:52Z|00050|binding|INFO|Setting lport 7369f952-1f44-445c-9449-347d6d476d79 up in Southbound
Oct 10 06:19:52 np0005479823 nova_compute[235775]: 2025-10-10 10:19:52.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.682 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[00057ab0-6ced-44cf-901c-e736930d4965]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:19:52 np0005479823 NetworkManager[44866]: <info>  [1760091592.6889] manager: (tapfb3e50c5-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/41)
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.689 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[b302f714-7a70-42fb-af8d-2d611b66b34f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.715 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[3bee4312-9e2f-4e71-b973-115af44bdada]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.718 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[8e6eccf4-9045-412b-abd6-19b384038d7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:19:52 np0005479823 NetworkManager[44866]: <info>  [1760091592.7449] device (tapfb3e50c5-f0): carrier: link connected
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.749 241521 DEBUG oslo.privsep.daemon [-] privsep: reply[483e18c8-7084-4dea-9d2b-46fdd9a0f8c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.764 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[e6c009c4-838f-428d-a322-4d5c739aab2c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfb3e50c5-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:c3:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449630, 'reachable_time': 40556, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247224, 'error': None, 'target': 'ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.783 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[4360f4ee-37f7-4d15-9dfe-c24f456516a1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:c3b9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 449630, 'tstamp': 449630}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247226, 'error': None, 'target': 'ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.801 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[f9e26675-d92c-4d42-b87c-65eb1a389e9b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfb3e50c5-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:c3:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449630, 'reachable_time': 40556, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247227, 'error': None, 'target': 'ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:19:52 np0005479823 nova_compute[235775]: 2025-10-10 10:19:52.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.841 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[4615adff-cb38-4d11-9d41-e92cc98e31eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:19:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.915 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[47c9888a-7067-420d-9416-027c9225bb9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.916 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfb3e50c5-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.917 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.918 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfb3e50c5-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:19:52 np0005479823 NetworkManager[44866]: <info>  [1760091592.9207] manager: (tapfb3e50c5-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Oct 10 06:19:52 np0005479823 kernel: tapfb3e50c5-f0: entered promiscuous mode
Oct 10 06:19:52 np0005479823 nova_compute[235775]: 2025-10-10 10:19:52.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.931 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfb3e50c5-f0, col_values=(('external_ids', {'iface-id': '50744b55-fb9e-4bc1-a3e6-4ad27846c672'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:19:52 np0005479823 nova_compute[235775]: 2025-10-10 10:19:52.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:52 np0005479823 nova_compute[235775]: 2025-10-10 10:19:52.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:52 np0005479823 ovn_controller[132503]: 2025-10-10T10:19:52Z|00051|binding|INFO|Releasing lport 50744b55-fb9e-4bc1-a3e6-4ad27846c672 from this chassis (sb_readonly=0)
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.935 141795 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fb3e50c5-fe48-4113-87d7-4e11945ac752.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fb3e50c5-fe48-4113-87d7-4e11945ac752.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.938 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[8f0df315-2f83-4249-ad8c-d81ae6411d7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.939 141795 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: global
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]:    log         /dev/log local0 debug
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]:    log-tag     haproxy-metadata-proxy-fb3e50c5-fe48-4113-87d7-4e11945ac752
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]:    user        root
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]:    group       root
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]:    maxconn     1024
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]:    pidfile     /var/lib/neutron/external/pids/fb3e50c5-fe48-4113-87d7-4e11945ac752.pid.haproxy
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]:    daemon
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: defaults
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]:    log global
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]:    mode http
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]:    option httplog
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]:    option dontlognull
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]:    option http-server-close
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]:    option forwardfor
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]:    retries                 3
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]:    timeout http-request    30s
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]:    timeout connect         30s
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]:    timeout client          32s
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]:    timeout server          32s
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]:    timeout http-keep-alive 30s
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: listen listener
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]:    bind 169.254.169.254:80
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]:    server metadata /var/lib/neutron/metadata_proxy
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]:    http-request add-header X-OVN-Network-ID fb3e50c5-fe48-4113-87d7-4e11945ac752
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct 10 06:19:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:19:52.940 141795 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752', 'env', 'PROCESS_TAG=haproxy-fb3e50c5-fe48-4113-87d7-4e11945ac752', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fb3e50c5-fe48-4113-87d7-4e11945ac752.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct 10 06:19:52 np0005479823 nova_compute[235775]: 2025-10-10 10:19:52.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:52 np0005479823 nova_compute[235775]: 2025-10-10 10:19:52.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.046 2 DEBUG nova.compute.manager [req-d6edb4e0-18b2-4133-9947-1fb8b4178ed4 req-4de2b525-c328-48e0-b430-e427f46aee5a 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.046 2 DEBUG oslo_concurrency.lockutils [req-d6edb4e0-18b2-4133-9947-1fb8b4178ed4 req-4de2b525-c328-48e0-b430-e427f46aee5a 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.047 2 DEBUG oslo_concurrency.lockutils [req-d6edb4e0-18b2-4133-9947-1fb8b4178ed4 req-4de2b525-c328-48e0-b430-e427f46aee5a 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.047 2 DEBUG oslo_concurrency.lockutils [req-d6edb4e0-18b2-4133-9947-1fb8b4178ed4 req-4de2b525-c328-48e0-b430-e427f46aee5a 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.047 2 DEBUG nova.compute.manager [req-d6edb4e0-18b2-4133-9947-1fb8b4178ed4 req-4de2b525-c328-48e0-b430-e427f46aee5a 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Processing event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct 10 06:19:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:19:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:53.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:19:53 np0005479823 podman[247302]: 2025-10-10 10:19:53.316491556 +0000 UTC m=+0.045393463 container create c17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct 10 06:19:53 np0005479823 systemd[1]: Started libpod-conmon-c17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3.scope.
Oct 10 06:19:53 np0005479823 podman[247302]: 2025-10-10 10:19:53.293126089 +0000 UTC m=+0.022028026 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct 10 06:19:53 np0005479823 systemd[1]: Started libcrun container.
Oct 10 06:19:53 np0005479823 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46f11d884e8a0583bed848a92a7b64922d9783a2c80338a45ab3d568340bb2fc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 10 06:19:53 np0005479823 podman[247302]: 2025-10-10 10:19:53.412325475 +0000 UTC m=+0.141227392 container init c17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 10 06:19:53 np0005479823 podman[247302]: 2025-10-10 10:19:53.416909852 +0000 UTC m=+0.145811759 container start c17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct 10 06:19:53 np0005479823 neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752[247318]: [NOTICE]   (247322) : New worker (247324) forked
Oct 10 06:19:53 np0005479823 neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752[247318]: [NOTICE]   (247322) : Loading success.
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.518 2 DEBUG nova.virt.driver [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Emitting event <LifecycleEvent: 1760091593.5181386, 4fd38b02-f79c-4eb5-9939-6939dda28a15 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.519 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] VM Started (Lifecycle Event)#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.520 2 DEBUG nova.compute.manager [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.523 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.526 2 INFO nova.virt.libvirt.driver [-] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Instance spawned successfully.#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.526 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.545 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.550 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.554 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.554 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.555 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.555 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.555 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.556 2 DEBUG nova.virt.libvirt.driver [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.583 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.583 2 DEBUG nova.virt.driver [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Emitting event <LifecycleEvent: 1760091593.5183418, 4fd38b02-f79c-4eb5-9939-6939dda28a15 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.584 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] VM Paused (Lifecycle Event)#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.606 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.609 2 DEBUG nova.virt.driver [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] Emitting event <LifecycleEvent: 1760091593.5228786, 4fd38b02-f79c-4eb5-9939-6939dda28a15 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.609 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] VM Resumed (Lifecycle Event)#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.611 2 INFO nova.compute.manager [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Took 5.88 seconds to spawn the instance on the hypervisor.#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.611 2 DEBUG nova.compute.manager [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.637 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.639 2 DEBUG nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.667 2 INFO nova.compute.manager [None req-211088b9-c069-40f7-abd9-bd999b03d3b4 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.680 2 INFO nova.compute.manager [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Took 6.84 seconds to build instance.#033[00m
Oct 10 06:19:53 np0005479823 nova_compute[235775]: 2025-10-10 10:19:53.697 2 DEBUG oslo_concurrency.lockutils [None req-b94705dc-54db-4ec6-9c97-627d91cefa0e 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.927s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:19:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:54.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:54 np0005479823 nova_compute[235775]: 2025-10-10 10:19:54.809 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:19:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:55 np0005479823 nova_compute[235775]: 2025-10-10 10:19:55.135 2 DEBUG nova.compute.manager [req-6e16f465-0e25-4b4f-931b-d2f838709a6c req-d584470b-a392-4c25-8578-677033908356 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct 10 06:19:55 np0005479823 nova_compute[235775]: 2025-10-10 10:19:55.136 2 DEBUG oslo_concurrency.lockutils [req-6e16f465-0e25-4b4f-931b-d2f838709a6c req-d584470b-a392-4c25-8578-677033908356 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:19:55 np0005479823 nova_compute[235775]: 2025-10-10 10:19:55.136 2 DEBUG oslo_concurrency.lockutils [req-6e16f465-0e25-4b4f-931b-d2f838709a6c req-d584470b-a392-4c25-8578-677033908356 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:19:55 np0005479823 nova_compute[235775]: 2025-10-10 10:19:55.136 2 DEBUG oslo_concurrency.lockutils [req-6e16f465-0e25-4b4f-931b-d2f838709a6c req-d584470b-a392-4c25-8578-677033908356 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:19:55 np0005479823 nova_compute[235775]: 2025-10-10 10:19:55.136 2 DEBUG nova.compute.manager [req-6e16f465-0e25-4b4f-931b-d2f838709a6c req-d584470b-a392-4c25-8578-677033908356 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] No waiting events found dispatching network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct 10 06:19:55 np0005479823 nova_compute[235775]: 2025-10-10 10:19:55.136 2 WARNING nova.compute.manager [req-6e16f465-0e25-4b4f-931b-d2f838709a6c req-d584470b-a392-4c25-8578-677033908356 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received unexpected event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 for instance with vm_state active and task_state None.#033[00m
Oct 10 06:19:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:19:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:55.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:19:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:19:55 np0005479823 ovn_controller[132503]: 2025-10-10T10:19:55Z|00052|binding|INFO|Releasing lport 50744b55-fb9e-4bc1-a3e6-4ad27846c672 from this chassis (sb_readonly=0)
Oct 10 06:19:55 np0005479823 nova_compute[235775]: 2025-10-10 10:19:55.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:55 np0005479823 NetworkManager[44866]: <info>  [1760091595.6649] manager: (patch-br-int-to-provnet-1d90fa58-74cb-4ad4-84e0-739689a69111): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Oct 10 06:19:55 np0005479823 NetworkManager[44866]: <info>  [1760091595.6663] manager: (patch-provnet-1d90fa58-74cb-4ad4-84e0-739689a69111-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Oct 10 06:19:55 np0005479823 nova_compute[235775]: 2025-10-10 10:19:55.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:55 np0005479823 ovn_controller[132503]: 2025-10-10T10:19:55Z|00053|binding|INFO|Releasing lport 50744b55-fb9e-4bc1-a3e6-4ad27846c672 from this chassis (sb_readonly=0)
Oct 10 06:19:55 np0005479823 nova_compute[235775]: 2025-10-10 10:19:55.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:55 np0005479823 nova_compute[235775]: 2025-10-10 10:19:55.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:19:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:19:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:19:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:19:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:19:56 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:19:56 np0005479823 nova_compute[235775]: 2025-10-10 10:19:56.123 2 DEBUG nova.compute.manager [req-c0a1a370-cf52-4ce3-961f-d44999a0d498 req-898dc673-3323-4d0d-be27-37bfd8831e6e 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-changed-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct 10 06:19:56 np0005479823 nova_compute[235775]: 2025-10-10 10:19:56.124 2 DEBUG nova.compute.manager [req-c0a1a370-cf52-4ce3-961f-d44999a0d498 req-898dc673-3323-4d0d-be27-37bfd8831e6e 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Refreshing instance network info cache due to event network-changed-7369f952-1f44-445c-9449-347d6d476d79. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct 10 06:19:56 np0005479823 nova_compute[235775]: 2025-10-10 10:19:56.124 2 DEBUG oslo_concurrency.lockutils [req-c0a1a370-cf52-4ce3-961f-d44999a0d498 req-898dc673-3323-4d0d-be27-37bfd8831e6e 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct 10 06:19:56 np0005479823 nova_compute[235775]: 2025-10-10 10:19:56.124 2 DEBUG oslo_concurrency.lockutils [req-c0a1a370-cf52-4ce3-961f-d44999a0d498 req-898dc673-3323-4d0d-be27-37bfd8831e6e 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquired lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct 10 06:19:56 np0005479823 nova_compute[235775]: 2025-10-10 10:19:56.124 2 DEBUG nova.network.neutron [req-c0a1a370-cf52-4ce3-961f-d44999a0d498 req-898dc673-3323-4d0d-be27-37bfd8831e6e 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Refreshing network info cache for port 7369f952-1f44-445c-9449-347d6d476d79 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct 10 06:19:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:56.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:56 np0005479823 nova_compute[235775]: 2025-10-10 10:19:56.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:56 np0005479823 nova_compute[235775]: 2025-10-10 10:19:56.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:19:56 np0005479823 nova_compute[235775]: 2025-10-10 10:19:56.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 10 06:19:56 np0005479823 nova_compute[235775]: 2025-10-10 10:19:56.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 10 06:19:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:56 np0005479823 nova_compute[235775]: 2025-10-10 10:19:56.993 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct 10 06:19:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:57.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:57 np0005479823 nova_compute[235775]: 2025-10-10 10:19:57.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:19:58 np0005479823 nova_compute[235775]: 2025-10-10 10:19:58.090 2 DEBUG nova.network.neutron [req-c0a1a370-cf52-4ce3-961f-d44999a0d498 req-898dc673-3323-4d0d-be27-37bfd8831e6e 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updated VIF entry in instance network info cache for port 7369f952-1f44-445c-9449-347d6d476d79. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct 10 06:19:58 np0005479823 nova_compute[235775]: 2025-10-10 10:19:58.090 2 DEBUG nova.network.neutron [req-c0a1a370-cf52-4ce3-961f-d44999a0d498 req-898dc673-3323-4d0d-be27-37bfd8831e6e 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updating instance_info_cache with network_info: [{"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct 10 06:19:58 np0005479823 nova_compute[235775]: 2025-10-10 10:19:58.116 2 DEBUG oslo_concurrency.lockutils [req-c0a1a370-cf52-4ce3-961f-d44999a0d498 req-898dc673-3323-4d0d-be27-37bfd8831e6e 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Releasing lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct 10 06:19:58 np0005479823 nova_compute[235775]: 2025-10-10 10:19:58.116 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquired lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct 10 06:19:58 np0005479823 nova_compute[235775]: 2025-10-10 10:19:58.116 2 DEBUG nova.network.neutron [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct 10 06:19:58 np0005479823 nova_compute[235775]: 2025-10-10 10:19:58.116 2 DEBUG nova.objects.instance [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4fd38b02-f79c-4eb5-9939-6939dda28a15 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct 10 06:19:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:19:58.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:19:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:19:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:19:59.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:19:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:19:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:19:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:19:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:20:00 np0005479823 ceph-mon[74913]: overall HEALTH_OK
Oct 10 06:20:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:00.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:20:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:20:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:20:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:01 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:20:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:01.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:01 np0005479823 nova_compute[235775]: 2025-10-10 10:20:01.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:02 np0005479823 nova_compute[235775]: 2025-10-10 10:20:02.205 2 DEBUG nova.network.neutron [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updating instance_info_cache with network_info: [{"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct 10 06:20:02 np0005479823 nova_compute[235775]: 2025-10-10 10:20:02.223 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Releasing lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct 10 06:20:02 np0005479823 nova_compute[235775]: 2025-10-10 10:20:02.224 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct 10 06:20:02 np0005479823 nova_compute[235775]: 2025-10-10 10:20:02.224 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:20:02 np0005479823 nova_compute[235775]: 2025-10-10 10:20:02.224 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:20:02 np0005479823 nova_compute[235775]: 2025-10-10 10:20:02.224 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:20:02 np0005479823 nova_compute[235775]: 2025-10-10 10:20:02.225 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:20:02 np0005479823 nova_compute[235775]: 2025-10-10 10:20:02.225 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct 10 06:20:02 np0005479823 nova_compute[235775]: 2025-10-10 10:20:02.225 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:20:02 np0005479823 nova_compute[235775]: 2025-10-10 10:20:02.252 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:20:02 np0005479823 nova_compute[235775]: 2025-10-10 10:20:02.252 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:20:02 np0005479823 nova_compute[235775]: 2025-10-10 10:20:02.253 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:20:02 np0005479823 nova_compute[235775]: 2025-10-10 10:20:02.253 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 10 06:20:02 np0005479823 nova_compute[235775]: 2025-10-10 10:20:02.253 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:20:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:02.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:02 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:20:02 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2835618541' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:20:02 np0005479823 nova_compute[235775]: 2025-10-10 10:20:02.702 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:20:02 np0005479823 nova_compute[235775]: 2025-10-10 10:20:02.799 2 DEBUG nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] skipping disk for instance-0000000b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct 10 06:20:02 np0005479823 nova_compute[235775]: 2025-10-10 10:20:02.799 2 DEBUG nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] skipping disk for instance-0000000b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Oct 10 06:20:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:02 np0005479823 nova_compute[235775]: 2025-10-10 10:20:02.965 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:20:02 np0005479823 nova_compute[235775]: 2025-10-10 10:20:02.967 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4639MB free_disk=59.96738052368164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 10 06:20:02 np0005479823 nova_compute[235775]: 2025-10-10 10:20:02.967 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:20:02 np0005479823 nova_compute[235775]: 2025-10-10 10:20:02.968 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:20:02 np0005479823 nova_compute[235775]: 2025-10-10 10:20:02.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:03 np0005479823 nova_compute[235775]: 2025-10-10 10:20:03.042 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Instance 4fd38b02-f79c-4eb5-9939-6939dda28a15 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct 10 06:20:03 np0005479823 nova_compute[235775]: 2025-10-10 10:20:03.043 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 10 06:20:03 np0005479823 nova_compute[235775]: 2025-10-10 10:20:03.043 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 10 06:20:03 np0005479823 nova_compute[235775]: 2025-10-10 10:20:03.062 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Refreshing inventories for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct 10 06:20:03 np0005479823 nova_compute[235775]: 2025-10-10 10:20:03.087 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Updating ProviderTree inventory for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct 10 06:20:03 np0005479823 nova_compute[235775]: 2025-10-10 10:20:03.088 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Updating inventory in ProviderTree for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct 10 06:20:03 np0005479823 nova_compute[235775]: 2025-10-10 10:20:03.106 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Refreshing aggregate associations for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct 10 06:20:03 np0005479823 nova_compute[235775]: 2025-10-10 10:20:03.130 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Refreshing trait associations for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0, traits: HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_FMA3,HW_CPU_X86_SVM,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_CLMUL,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_BMI2,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct 10 06:20:03 np0005479823 nova_compute[235775]: 2025-10-10 10:20:03.170 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:20:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:20:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:03.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:20:03 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:20:03 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1513687594' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:20:03 np0005479823 nova_compute[235775]: 2025-10-10 10:20:03.641 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:20:03 np0005479823 nova_compute[235775]: 2025-10-10 10:20:03.647 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 10 06:20:03 np0005479823 nova_compute[235775]: 2025-10-10 10:20:03.665 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 10 06:20:03 np0005479823 nova_compute[235775]: 2025-10-10 10:20:03.688 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct 10 06:20:03 np0005479823 nova_compute[235775]: 2025-10-10 10:20:03.689 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.721s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:20:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:04.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:20:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:05.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:20:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:20:05 np0005479823 podman[247418]: 2025-10-10 10:20:05.790683469 +0000 UTC m=+0.060121106 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251001)
Oct 10 06:20:05 np0005479823 podman[247416]: 2025-10-10 10:20:05.79101943 +0000 UTC m=+0.062813113 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 06:20:05 np0005479823 podman[247417]: 2025-10-10 10:20:05.840644119 +0000 UTC m=+0.112504334 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct 10 06:20:05 np0005479823 ovn_controller[132503]: 2025-10-10T10:20:05Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:54:86:3b 10.100.0.5
Oct 10 06:20:05 np0005479823 ovn_controller[132503]: 2025-10-10T10:20:05Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:54:86:3b 10.100.0.5
Oct 10 06:20:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:20:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:20:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:20:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:20:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:20:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:06.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:20:06 np0005479823 nova_compute[235775]: 2025-10-10 10:20:06.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:20:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:07.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:20:07 np0005479823 nova_compute[235775]: 2025-10-10 10:20:07.685 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:20:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:07 np0005479823 nova_compute[235775]: 2025-10-10 10:20:07.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:20:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:08.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:20:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:20:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:09.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:20:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:20:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:20:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:20:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:20:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:20:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:20:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:10.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:20:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:11.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:11 np0005479823 nova_compute[235775]: 2025-10-10 10:20:11.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:20:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:12.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:20:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:12 np0005479823 nova_compute[235775]: 2025-10-10 10:20:12.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:20:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:13.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:20:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:14.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:20:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:20:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:20:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:20:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:20:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:15.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:20:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:20:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:16 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:20:16 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:20:16 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:20:16 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:20:16 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:20:16 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:20:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:16.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:16 np0005479823 nova_compute[235775]: 2025-10-10 10:20:16.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:17 np0005479823 ceph-mon[74913]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Oct 10 06:20:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:20:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:17.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:20:17 np0005479823 podman[247572]: 2025-10-10 10:20:17.778550578 +0000 UTC m=+0.050138256 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 10 06:20:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:18 np0005479823 nova_compute[235775]: 2025-10-10 10:20:17.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:18.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:19.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:20:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:20:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:20:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:20:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:20:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:20.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:21.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:21 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:20:21 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:20:21 np0005479823 nova_compute[235775]: 2025-10-10 10:20:21.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:22.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:23 np0005479823 nova_compute[235775]: 2025-10-10 10:20:23.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:20:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:23.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:20:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:24.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:20:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:20:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:20:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:20:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:25.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:20:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 10 06:20:26 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2156092452' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 06:20:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 10 06:20:26 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2156092452' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 06:20:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:26.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:26 np0005479823 nova_compute[235775]: 2025-10-10 10:20:26.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:27.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:28 np0005479823 nova_compute[235775]: 2025-10-10 10:20:28.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:28.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:29.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:20:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:20:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:20:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:20:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:20:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:20:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:30.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:20:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:31.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:31 np0005479823 nova_compute[235775]: 2025-10-10 10:20:31.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:32 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:20:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:32.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:33 np0005479823 nova_compute[235775]: 2025-10-10 10:20:33.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:33.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:34.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:20:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:20:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:20:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:20:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:35.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:20:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:20:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:36.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:20:36 np0005479823 nova_compute[235775]: 2025-10-10 10:20:36.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:36 np0005479823 podman[247660]: 2025-10-10 10:20:36.814147702 +0000 UTC m=+0.074334830 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 10 06:20:36 np0005479823 podman[247661]: 2025-10-10 10:20:36.849930348 +0000 UTC m=+0.103348650 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, managed_by=edpm_ansible)
Oct 10 06:20:36 np0005479823 podman[247662]: 2025-10-10 10:20:36.861195289 +0000 UTC m=+0.103094892 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 10 06:20:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:37.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:38 np0005479823 nova_compute[235775]: 2025-10-10 10:20:38.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:38 np0005479823 nova_compute[235775]: 2025-10-10 10:20:38.172 2 INFO nova.compute.manager [None req-6ab692d3-34c4-4952-aa17-bb34c3d87ab3 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Get console output#033[00m
Oct 10 06:20:38 np0005479823 nova_compute[235775]: 2025-10-10 10:20:38.180 763 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct 10 06:20:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:20:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:38.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:20:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:38 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:20:38.889 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct 10 06:20:38 np0005479823 nova_compute[235775]: 2025-10-10 10:20:38.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:38 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:20:38.890 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct 10 06:20:39 np0005479823 nova_compute[235775]: 2025-10-10 10:20:39.078 2 DEBUG nova.compute.manager [req-897afdf4-0444-4cdb-ae5f-d95185ab20e4 req-5b9e23af-e3da-4b91-9ce9-42a31aa7946c 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-changed-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct 10 06:20:39 np0005479823 nova_compute[235775]: 2025-10-10 10:20:39.079 2 DEBUG nova.compute.manager [req-897afdf4-0444-4cdb-ae5f-d95185ab20e4 req-5b9e23af-e3da-4b91-9ce9-42a31aa7946c 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Refreshing instance network info cache due to event network-changed-7369f952-1f44-445c-9449-347d6d476d79. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct 10 06:20:39 np0005479823 nova_compute[235775]: 2025-10-10 10:20:39.079 2 DEBUG oslo_concurrency.lockutils [req-897afdf4-0444-4cdb-ae5f-d95185ab20e4 req-5b9e23af-e3da-4b91-9ce9-42a31aa7946c 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct 10 06:20:39 np0005479823 nova_compute[235775]: 2025-10-10 10:20:39.080 2 DEBUG oslo_concurrency.lockutils [req-897afdf4-0444-4cdb-ae5f-d95185ab20e4 req-5b9e23af-e3da-4b91-9ce9-42a31aa7946c 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquired lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct 10 06:20:39 np0005479823 nova_compute[235775]: 2025-10-10 10:20:39.080 2 DEBUG nova.network.neutron [req-897afdf4-0444-4cdb-ae5f-d95185ab20e4 req-5b9e23af-e3da-4b91-9ce9-42a31aa7946c 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Refreshing network info cache for port 7369f952-1f44-445c-9449-347d6d476d79 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct 10 06:20:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:39.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:20:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:20:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:20:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:20:40 np0005479823 nova_compute[235775]: 2025-10-10 10:20:40.091 2 INFO nova.compute.manager [None req-f097cfbc-f867-41d2-9e1f-60cca500be17 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Get console output#033[00m
Oct 10 06:20:40 np0005479823 nova_compute[235775]: 2025-10-10 10:20:40.095 763 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct 10 06:20:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:20:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:20:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:40.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:20:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:41 np0005479823 nova_compute[235775]: 2025-10-10 10:20:41.183 2 DEBUG nova.compute.manager [req-0d5e56d7-51e7-418f-88fb-bd782f8d637a req-1e8739d7-3019-477b-8da6-71e2a6d794b9 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-vif-unplugged-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct 10 06:20:41 np0005479823 nova_compute[235775]: 2025-10-10 10:20:41.183 2 DEBUG oslo_concurrency.lockutils [req-0d5e56d7-51e7-418f-88fb-bd782f8d637a req-1e8739d7-3019-477b-8da6-71e2a6d794b9 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:20:41 np0005479823 nova_compute[235775]: 2025-10-10 10:20:41.184 2 DEBUG oslo_concurrency.lockutils [req-0d5e56d7-51e7-418f-88fb-bd782f8d637a req-1e8739d7-3019-477b-8da6-71e2a6d794b9 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:20:41 np0005479823 nova_compute[235775]: 2025-10-10 10:20:41.185 2 DEBUG oslo_concurrency.lockutils [req-0d5e56d7-51e7-418f-88fb-bd782f8d637a req-1e8739d7-3019-477b-8da6-71e2a6d794b9 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:20:41 np0005479823 nova_compute[235775]: 2025-10-10 10:20:41.185 2 DEBUG nova.compute.manager [req-0d5e56d7-51e7-418f-88fb-bd782f8d637a req-1e8739d7-3019-477b-8da6-71e2a6d794b9 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] No waiting events found dispatching network-vif-unplugged-7369f952-1f44-445c-9449-347d6d476d79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct 10 06:20:41 np0005479823 nova_compute[235775]: 2025-10-10 10:20:41.186 2 WARNING nova.compute.manager [req-0d5e56d7-51e7-418f-88fb-bd782f8d637a req-1e8739d7-3019-477b-8da6-71e2a6d794b9 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received unexpected event network-vif-unplugged-7369f952-1f44-445c-9449-347d6d476d79 for instance with vm_state active and task_state None.#033[00m
Oct 10 06:20:41 np0005479823 nova_compute[235775]: 2025-10-10 10:20:41.186 2 DEBUG nova.compute.manager [req-0d5e56d7-51e7-418f-88fb-bd782f8d637a req-1e8739d7-3019-477b-8da6-71e2a6d794b9 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct 10 06:20:41 np0005479823 nova_compute[235775]: 2025-10-10 10:20:41.187 2 DEBUG oslo_concurrency.lockutils [req-0d5e56d7-51e7-418f-88fb-bd782f8d637a req-1e8739d7-3019-477b-8da6-71e2a6d794b9 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:20:41 np0005479823 nova_compute[235775]: 2025-10-10 10:20:41.188 2 DEBUG oslo_concurrency.lockutils [req-0d5e56d7-51e7-418f-88fb-bd782f8d637a req-1e8739d7-3019-477b-8da6-71e2a6d794b9 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:20:41 np0005479823 nova_compute[235775]: 2025-10-10 10:20:41.189 2 DEBUG oslo_concurrency.lockutils [req-0d5e56d7-51e7-418f-88fb-bd782f8d637a req-1e8739d7-3019-477b-8da6-71e2a6d794b9 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:20:41 np0005479823 nova_compute[235775]: 2025-10-10 10:20:41.189 2 DEBUG nova.compute.manager [req-0d5e56d7-51e7-418f-88fb-bd782f8d637a req-1e8739d7-3019-477b-8da6-71e2a6d794b9 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] No waiting events found dispatching network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct 10 06:20:41 np0005479823 nova_compute[235775]: 2025-10-10 10:20:41.190 2 WARNING nova.compute.manager [req-0d5e56d7-51e7-418f-88fb-bd782f8d637a req-1e8739d7-3019-477b-8da6-71e2a6d794b9 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received unexpected event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 for instance with vm_state active and task_state None.#033[00m
Oct 10 06:20:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:20:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:41.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:20:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:20:41.476 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:20:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:20:41.476 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:20:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:20:41.477 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:20:41 np0005479823 nova_compute[235775]: 2025-10-10 10:20:41.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:42.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:43 np0005479823 nova_compute[235775]: 2025-10-10 10:20:43.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:43.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:43 np0005479823 nova_compute[235775]: 2025-10-10 10:20:43.378 2 DEBUG nova.network.neutron [req-897afdf4-0444-4cdb-ae5f-d95185ab20e4 req-5b9e23af-e3da-4b91-9ce9-42a31aa7946c 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updated VIF entry in instance network info cache for port 7369f952-1f44-445c-9449-347d6d476d79. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct 10 06:20:43 np0005479823 nova_compute[235775]: 2025-10-10 10:20:43.379 2 DEBUG nova.network.neutron [req-897afdf4-0444-4cdb-ae5f-d95185ab20e4 req-5b9e23af-e3da-4b91-9ce9-42a31aa7946c 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updating instance_info_cache with network_info: [{"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct 10 06:20:43 np0005479823 nova_compute[235775]: 2025-10-10 10:20:43.404 2 DEBUG oslo_concurrency.lockutils [req-897afdf4-0444-4cdb-ae5f-d95185ab20e4 req-5b9e23af-e3da-4b91-9ce9-42a31aa7946c 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Releasing lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct 10 06:20:43 np0005479823 nova_compute[235775]: 2025-10-10 10:20:43.653 2 DEBUG nova.compute.manager [req-9fbbe017-8c18-4905-b94a-964d5c00e721 req-b66615ad-88d5-4854-89f6-df4a47c94912 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-changed-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct 10 06:20:43 np0005479823 nova_compute[235775]: 2025-10-10 10:20:43.654 2 DEBUG nova.compute.manager [req-9fbbe017-8c18-4905-b94a-964d5c00e721 req-b66615ad-88d5-4854-89f6-df4a47c94912 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Refreshing instance network info cache due to event network-changed-7369f952-1f44-445c-9449-347d6d476d79. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct 10 06:20:43 np0005479823 nova_compute[235775]: 2025-10-10 10:20:43.655 2 DEBUG oslo_concurrency.lockutils [req-9fbbe017-8c18-4905-b94a-964d5c00e721 req-b66615ad-88d5-4854-89f6-df4a47c94912 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct 10 06:20:43 np0005479823 nova_compute[235775]: 2025-10-10 10:20:43.655 2 DEBUG oslo_concurrency.lockutils [req-9fbbe017-8c18-4905-b94a-964d5c00e721 req-b66615ad-88d5-4854-89f6-df4a47c94912 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquired lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct 10 06:20:43 np0005479823 nova_compute[235775]: 2025-10-10 10:20:43.656 2 DEBUG nova.network.neutron [req-9fbbe017-8c18-4905-b94a-964d5c00e721 req-b66615ad-88d5-4854-89f6-df4a47c94912 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Refreshing network info cache for port 7369f952-1f44-445c-9449-347d6d476d79 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct 10 06:20:43 np0005479823 nova_compute[235775]: 2025-10-10 10:20:43.830 2 INFO nova.compute.manager [None req-27ba2d02-e255-4faf-afc6-87da70d38f8a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Get console output#033[00m
Oct 10 06:20:43 np0005479823 nova_compute[235775]: 2025-10-10 10:20:43.837 763 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct 10 06:20:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:44.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:44 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:20:44.892 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:20:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:20:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:20:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:20:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:20:45 np0005479823 nova_compute[235775]: 2025-10-10 10:20:45.140 2 DEBUG nova.network.neutron [req-9fbbe017-8c18-4905-b94a-964d5c00e721 req-b66615ad-88d5-4854-89f6-df4a47c94912 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updated VIF entry in instance network info cache for port 7369f952-1f44-445c-9449-347d6d476d79. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct 10 06:20:45 np0005479823 nova_compute[235775]: 2025-10-10 10:20:45.140 2 DEBUG nova.network.neutron [req-9fbbe017-8c18-4905-b94a-964d5c00e721 req-b66615ad-88d5-4854-89f6-df4a47c94912 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updating instance_info_cache with network_info: [{"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct 10 06:20:45 np0005479823 nova_compute[235775]: 2025-10-10 10:20:45.161 2 DEBUG oslo_concurrency.lockutils [req-9fbbe017-8c18-4905-b94a-964d5c00e721 req-b66615ad-88d5-4854-89f6-df4a47c94912 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Releasing lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct 10 06:20:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:20:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:45.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:20:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:20:45 np0005479823 nova_compute[235775]: 2025-10-10 10:20:45.773 2 DEBUG nova.compute.manager [req-471ab561-9162-46df-820b-dce4a6a8b53e req-e9ff2f14-4abf-465c-8768-f8583a909661 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct 10 06:20:45 np0005479823 nova_compute[235775]: 2025-10-10 10:20:45.774 2 DEBUG oslo_concurrency.lockutils [req-471ab561-9162-46df-820b-dce4a6a8b53e req-e9ff2f14-4abf-465c-8768-f8583a909661 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:20:45 np0005479823 nova_compute[235775]: 2025-10-10 10:20:45.774 2 DEBUG oslo_concurrency.lockutils [req-471ab561-9162-46df-820b-dce4a6a8b53e req-e9ff2f14-4abf-465c-8768-f8583a909661 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:20:45 np0005479823 nova_compute[235775]: 2025-10-10 10:20:45.775 2 DEBUG oslo_concurrency.lockutils [req-471ab561-9162-46df-820b-dce4a6a8b53e req-e9ff2f14-4abf-465c-8768-f8583a909661 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:20:45 np0005479823 nova_compute[235775]: 2025-10-10 10:20:45.775 2 DEBUG nova.compute.manager [req-471ab561-9162-46df-820b-dce4a6a8b53e req-e9ff2f14-4abf-465c-8768-f8583a909661 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] No waiting events found dispatching network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct 10 06:20:45 np0005479823 nova_compute[235775]: 2025-10-10 10:20:45.775 2 WARNING nova.compute.manager [req-471ab561-9162-46df-820b-dce4a6a8b53e req-e9ff2f14-4abf-465c-8768-f8583a909661 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received unexpected event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 for instance with vm_state active and task_state None.#033[00m
Oct 10 06:20:45 np0005479823 nova_compute[235775]: 2025-10-10 10:20:45.775 2 DEBUG nova.compute.manager [req-471ab561-9162-46df-820b-dce4a6a8b53e req-e9ff2f14-4abf-465c-8768-f8583a909661 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct 10 06:20:45 np0005479823 nova_compute[235775]: 2025-10-10 10:20:45.776 2 DEBUG oslo_concurrency.lockutils [req-471ab561-9162-46df-820b-dce4a6a8b53e req-e9ff2f14-4abf-465c-8768-f8583a909661 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:20:45 np0005479823 nova_compute[235775]: 2025-10-10 10:20:45.776 2 DEBUG oslo_concurrency.lockutils [req-471ab561-9162-46df-820b-dce4a6a8b53e req-e9ff2f14-4abf-465c-8768-f8583a909661 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:20:45 np0005479823 nova_compute[235775]: 2025-10-10 10:20:45.776 2 DEBUG oslo_concurrency.lockutils [req-471ab561-9162-46df-820b-dce4a6a8b53e req-e9ff2f14-4abf-465c-8768-f8583a909661 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:20:45 np0005479823 nova_compute[235775]: 2025-10-10 10:20:45.777 2 DEBUG nova.compute.manager [req-471ab561-9162-46df-820b-dce4a6a8b53e req-e9ff2f14-4abf-465c-8768-f8583a909661 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] No waiting events found dispatching network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct 10 06:20:45 np0005479823 nova_compute[235775]: 2025-10-10 10:20:45.777 2 WARNING nova.compute.manager [req-471ab561-9162-46df-820b-dce4a6a8b53e req-e9ff2f14-4abf-465c-8768-f8583a909661 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received unexpected event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 for instance with vm_state active and task_state None.#033[00m
Oct 10 06:20:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:46.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:46 np0005479823 nova_compute[235775]: 2025-10-10 10:20:46.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:20:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:47.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:20:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:48 np0005479823 nova_compute[235775]: 2025-10-10 10:20:48.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:48.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:48 np0005479823 podman[247762]: 2025-10-10 10:20:48.792622841 +0000 UTC m=+0.071257892 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct 10 06:20:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:49.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:20:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:20:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:20:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.216 2 DEBUG nova.compute.manager [req-4f8402f5-ebfa-402d-bf56-4715e53a561c req-3d7d74c9-030d-4944-93d3-da4a7cb4f958 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-changed-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.216 2 DEBUG nova.compute.manager [req-4f8402f5-ebfa-402d-bf56-4715e53a561c req-3d7d74c9-030d-4944-93d3-da4a7cb4f958 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Refreshing instance network info cache due to event network-changed-7369f952-1f44-445c-9449-347d6d476d79. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.217 2 DEBUG oslo_concurrency.lockutils [req-4f8402f5-ebfa-402d-bf56-4715e53a561c req-3d7d74c9-030d-4944-93d3-da4a7cb4f958 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.217 2 DEBUG oslo_concurrency.lockutils [req-4f8402f5-ebfa-402d-bf56-4715e53a561c req-3d7d74c9-030d-4944-93d3-da4a7cb4f958 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquired lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.217 2 DEBUG nova.network.neutron [req-4f8402f5-ebfa-402d-bf56-4715e53a561c req-3d7d74c9-030d-4944-93d3-da4a7cb4f958 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Refreshing network info cache for port 7369f952-1f44-445c-9449-347d6d476d79 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.287 2 DEBUG oslo_concurrency.lockutils [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "4fd38b02-f79c-4eb5-9939-6939dda28a15" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.288 2 DEBUG oslo_concurrency.lockutils [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.288 2 DEBUG oslo_concurrency.lockutils [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.288 2 DEBUG oslo_concurrency.lockutils [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.288 2 DEBUG oslo_concurrency.lockutils [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.289 2 INFO nova.compute.manager [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Terminating instance#033[00m
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.290 2 DEBUG nova.compute.manager [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct 10 06:20:50 np0005479823 kernel: tap7369f952-1f (unregistering): left promiscuous mode
Oct 10 06:20:50 np0005479823 NetworkManager[44866]: <info>  [1760091650.3325] device (tap7369f952-1f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 10 06:20:50 np0005479823 ovn_controller[132503]: 2025-10-10T10:20:50Z|00054|binding|INFO|Releasing lport 7369f952-1f44-445c-9449-347d6d476d79 from this chassis (sb_readonly=0)
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:50 np0005479823 ovn_controller[132503]: 2025-10-10T10:20:50Z|00055|binding|INFO|Setting lport 7369f952-1f44-445c-9449-347d6d476d79 down in Southbound
Oct 10 06:20:50 np0005479823 ovn_controller[132503]: 2025-10-10T10:20:50Z|00056|binding|INFO|Removing iface tap7369f952-1f ovn-installed in OVS
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:50 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.349 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:86:3b 10.100.0.5'], port_security=['fa:16:3e:54:86:3b 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '4fd38b02-f79c-4eb5-9939-6939dda28a15', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb3e50c5-fe48-4113-87d7-4e11945ac752', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd5e531d4b440422d946eaf6fd4e166f7', 'neutron:revision_number': '8', 'neutron:security_group_ids': '26e88f36-7c05-4376-877b-78cbbe604817', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc36a9e4-a12c-4b9d-8968-49f72bde3476, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>], logical_port=7369f952-1f44-445c-9449-347d6d476d79) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fa4a23cd910>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct 10 06:20:50 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.351 141795 INFO neutron.agent.ovn.metadata.agent [-] Port 7369f952-1f44-445c-9449-347d6d476d79 in datapath fb3e50c5-fe48-4113-87d7-4e11945ac752 unbound from our chassis#033[00m
Oct 10 06:20:50 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.352 141795 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fb3e50c5-fe48-4113-87d7-4e11945ac752, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct 10 06:20:50 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.353 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[b8b1952c-1ffb-4e8c-8c7e-3c3bc90cbd0b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:20:50 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.354 141795 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752 namespace which is not needed anymore#033[00m
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:20:50 np0005479823 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Oct 10 06:20:50 np0005479823 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d0000000b.scope: Consumed 13.896s CPU time.
Oct 10 06:20:50 np0005479823 systemd-machined[192768]: Machine qemu-3-instance-0000000b terminated.
Oct 10 06:20:50 np0005479823 neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752[247318]: [NOTICE]   (247322) : haproxy version is 2.8.14-c23fe91
Oct 10 06:20:50 np0005479823 neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752[247318]: [NOTICE]   (247322) : path to executable is /usr/sbin/haproxy
Oct 10 06:20:50 np0005479823 neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752[247318]: [WARNING]  (247322) : Exiting Master process...
Oct 10 06:20:50 np0005479823 neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752[247318]: [WARNING]  (247322) : Exiting Master process...
Oct 10 06:20:50 np0005479823 neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752[247318]: [ALERT]    (247322) : Current worker (247324) exited with code 143 (Terminated)
Oct 10 06:20:50 np0005479823 neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752[247318]: [WARNING]  (247322) : All workers exited. Exiting... (0)
Oct 10 06:20:50 np0005479823 systemd[1]: libpod-c17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3.scope: Deactivated successfully.
Oct 10 06:20:50 np0005479823 podman[247806]: 2025-10-10 10:20:50.490225056 +0000 UTC m=+0.041378995 container died c17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 06:20:50 np0005479823 kernel: tap7369f952-1f: entered promiscuous mode
Oct 10 06:20:50 np0005479823 kernel: tap7369f952-1f (unregistering): left promiscuous mode
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:50 np0005479823 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3-userdata-shm.mount: Deactivated successfully.
Oct 10 06:20:50 np0005479823 systemd[1]: var-lib-containers-storage-overlay-46f11d884e8a0583bed848a92a7b64922d9783a2c80338a45ab3d568340bb2fc-merged.mount: Deactivated successfully.
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.527 2 INFO nova.virt.libvirt.driver [-] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Instance destroyed successfully.#033[00m
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.527 2 DEBUG nova.objects.instance [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lazy-loading 'resources' on Instance uuid 4fd38b02-f79c-4eb5-9939-6939dda28a15 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct 10 06:20:50 np0005479823 podman[247806]: 2025-10-10 10:20:50.532042475 +0000 UTC m=+0.083196414 container cleanup c17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 06:20:50 np0005479823 systemd[1]: libpod-conmon-c17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3.scope: Deactivated successfully.
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.542 2 DEBUG nova.virt.libvirt.vif [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-10T10:19:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-900143833',display_name='tempest-TestNetworkBasicOps-server-900143833',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-900143833',id=11,image_ref='5ae78700-970d-45b4-a57d-978a054c7519',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJQ6vYp+U8d7Yiink0K/iQNUjrLla5VjGnuqrTVtw+u6eTZg4qjU5w1TFNoLgk+EE3EJPtqEojXIPj0UMRCIST/kkZjRsWCJV3t0ho4U419OoM2lVk7/JJmPOAXOx5ZoVg==',key_name='tempest-TestNetworkBasicOps-780402283',keypairs=<?>,launch_index=0,launched_at=2025-10-10T10:19:53Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d5e531d4b440422d946eaf6fd4e166f7',ramdisk_id='',reservation_id='r-v13j2ta3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5ae78700-970d-45b4-a57d-978a054c7519',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-188749107',owner_user_name='tempest-TestNetworkBasicOps-188749107-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-10T10:19:53Z,user_data=None,user_id='7956778c03764aaf8906c9b435337976',uuid=4fd38b02-f79c-4eb5-9939-6939dda28a15,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.542 2 DEBUG nova.network.os_vif_util [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converting VIF {"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.543 2 DEBUG nova.network.os_vif_util [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:54:86:3b,bridge_name='br-int',has_traffic_filtering=True,id=7369f952-1f44-445c-9449-347d6d476d79,network=Network(fb3e50c5-fe48-4113-87d7-4e11945ac752),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7369f952-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.543 2 DEBUG os_vif [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:54:86:3b,bridge_name='br-int',has_traffic_filtering=True,id=7369f952-1f44-445c-9449-347d6d476d79,network=Network(fb3e50c5-fe48-4113-87d7-4e11945ac752),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7369f952-1f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.545 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7369f952-1f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.549 2 INFO os_vif [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:54:86:3b,bridge_name='br-int',has_traffic_filtering=True,id=7369f952-1f44-445c-9449-347d6d476d79,network=Network(fb3e50c5-fe48-4113-87d7-4e11945ac752),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7369f952-1f')#033[00m
Oct 10 06:20:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:50.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:50 np0005479823 podman[247843]: 2025-10-10 10:20:50.593008317 +0000 UTC m=+0.041489259 container remove c17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 10 06:20:50 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.598 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[ab18cc26-656a-457d-b51c-5272a41541c6]: (4, ('Fri Oct 10 10:20:50 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752 (c17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3)\nc17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3\nFri Oct 10 10:20:50 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752 (c17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3)\nc17bc6d04c5c8d9b08bd4a6519f32603e41d5aeccea881c08f4a084722fd0be3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:20:50 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.601 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[928587fb-369d-4c1a-9af4-798714ec91a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:20:50 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.602 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfb3e50c5-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:20:50 np0005479823 kernel: tapfb3e50c5-f0: left promiscuous mode
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:20:50 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.618 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[91e506e8-0471-4a19-a08b-095d4d50149e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:20:50 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.645 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[2e0026a8-6b02-42cc-ac63-7d80000867f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:20:50 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.647 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[f84d46cb-d8c9-4aec-ad8c-dc91990b3e88]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:20:50 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.667 241439 DEBUG oslo.privsep.daemon [-] privsep: reply[5ac4c78d-679f-4fdf-bf58-53d7c80db79e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449623, 'reachable_time': 35322, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247876, 'error': None, 'target': 'ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:20:50 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.669 141908 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fb3e50c5-fe48-4113-87d7-4e11945ac752 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct 10 06:20:50 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:20:50.669 141908 DEBUG oslo.privsep.daemon [-] privsep: reply[4097473c-9a9b-43e3-ab3b-036e441fad94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct 10 06:20:50 np0005479823 systemd[1]: run-netns-ovnmeta\x2dfb3e50c5\x2dfe48\x2d4113\x2d87d7\x2d4e11945ac752.mount: Deactivated successfully.
Oct 10 06:20:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.927 2 INFO nova.virt.libvirt.driver [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Deleting instance files /var/lib/nova/instances/4fd38b02-f79c-4eb5-9939-6939dda28a15_del#033[00m
Oct 10 06:20:50 np0005479823 nova_compute[235775]: 2025-10-10 10:20:50.929 2 INFO nova.virt.libvirt.driver [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Deletion of /var/lib/nova/instances/4fd38b02-f79c-4eb5-9939-6939dda28a15_del complete#033[00m
Oct 10 06:20:51 np0005479823 nova_compute[235775]: 2025-10-10 10:20:51.025 2 INFO nova.compute.manager [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Took 0.73 seconds to destroy the instance on the hypervisor.#033[00m
Oct 10 06:20:51 np0005479823 nova_compute[235775]: 2025-10-10 10:20:51.026 2 DEBUG oslo.service.loopingcall [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct 10 06:20:51 np0005479823 nova_compute[235775]: 2025-10-10 10:20:51.027 2 DEBUG nova.compute.manager [-] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct 10 06:20:51 np0005479823 nova_compute[235775]: 2025-10-10 10:20:51.027 2 DEBUG nova.network.neutron [-] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct 10 06:20:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:20:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:51.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:20:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:52 np0005479823 nova_compute[235775]: 2025-10-10 10:20:52.313 2 DEBUG nova.network.neutron [-] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct 10 06:20:52 np0005479823 nova_compute[235775]: 2025-10-10 10:20:52.317 2 DEBUG nova.compute.manager [req-02f61625-8adb-434b-bc91-ee8a9913b4e8 req-23159207-a6c7-4c17-9aa2-9ac282430abc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-vif-unplugged-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct 10 06:20:52 np0005479823 nova_compute[235775]: 2025-10-10 10:20:52.317 2 DEBUG oslo_concurrency.lockutils [req-02f61625-8adb-434b-bc91-ee8a9913b4e8 req-23159207-a6c7-4c17-9aa2-9ac282430abc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:20:52 np0005479823 nova_compute[235775]: 2025-10-10 10:20:52.317 2 DEBUG oslo_concurrency.lockutils [req-02f61625-8adb-434b-bc91-ee8a9913b4e8 req-23159207-a6c7-4c17-9aa2-9ac282430abc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:20:52 np0005479823 nova_compute[235775]: 2025-10-10 10:20:52.318 2 DEBUG oslo_concurrency.lockutils [req-02f61625-8adb-434b-bc91-ee8a9913b4e8 req-23159207-a6c7-4c17-9aa2-9ac282430abc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:20:52 np0005479823 nova_compute[235775]: 2025-10-10 10:20:52.318 2 DEBUG nova.compute.manager [req-02f61625-8adb-434b-bc91-ee8a9913b4e8 req-23159207-a6c7-4c17-9aa2-9ac282430abc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] No waiting events found dispatching network-vif-unplugged-7369f952-1f44-445c-9449-347d6d476d79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct 10 06:20:52 np0005479823 nova_compute[235775]: 2025-10-10 10:20:52.318 2 DEBUG nova.compute.manager [req-02f61625-8adb-434b-bc91-ee8a9913b4e8 req-23159207-a6c7-4c17-9aa2-9ac282430abc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-vif-unplugged-7369f952-1f44-445c-9449-347d6d476d79 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct 10 06:20:52 np0005479823 nova_compute[235775]: 2025-10-10 10:20:52.318 2 DEBUG nova.compute.manager [req-02f61625-8adb-434b-bc91-ee8a9913b4e8 req-23159207-a6c7-4c17-9aa2-9ac282430abc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct 10 06:20:52 np0005479823 nova_compute[235775]: 2025-10-10 10:20:52.318 2 DEBUG oslo_concurrency.lockutils [req-02f61625-8adb-434b-bc91-ee8a9913b4e8 req-23159207-a6c7-4c17-9aa2-9ac282430abc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Acquiring lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:20:52 np0005479823 nova_compute[235775]: 2025-10-10 10:20:52.319 2 DEBUG oslo_concurrency.lockutils [req-02f61625-8adb-434b-bc91-ee8a9913b4e8 req-23159207-a6c7-4c17-9aa2-9ac282430abc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:20:52 np0005479823 nova_compute[235775]: 2025-10-10 10:20:52.319 2 DEBUG oslo_concurrency.lockutils [req-02f61625-8adb-434b-bc91-ee8a9913b4e8 req-23159207-a6c7-4c17-9aa2-9ac282430abc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 06:20:52 np0005479823 nova_compute[235775]: 2025-10-10 10:20:52.319 2 DEBUG nova.compute.manager [req-02f61625-8adb-434b-bc91-ee8a9913b4e8 req-23159207-a6c7-4c17-9aa2-9ac282430abc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] No waiting events found dispatching network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct 10 06:20:52 np0005479823 nova_compute[235775]: 2025-10-10 10:20:52.319 2 WARNING nova.compute.manager [req-02f61625-8adb-434b-bc91-ee8a9913b4e8 req-23159207-a6c7-4c17-9aa2-9ac282430abc 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received unexpected event network-vif-plugged-7369f952-1f44-445c-9449-347d6d476d79 for instance with vm_state active and task_state deleting.
Oct 10 06:20:52 np0005479823 nova_compute[235775]: 2025-10-10 10:20:52.333 2 INFO nova.compute.manager [-] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Took 1.31 seconds to deallocate network for instance.
Oct 10 06:20:52 np0005479823 nova_compute[235775]: 2025-10-10 10:20:52.385 2 DEBUG oslo_concurrency.lockutils [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 06:20:52 np0005479823 nova_compute[235775]: 2025-10-10 10:20:52.385 2 DEBUG oslo_concurrency.lockutils [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 06:20:52 np0005479823 nova_compute[235775]: 2025-10-10 10:20:52.402 2 DEBUG nova.compute.manager [req-bc8c7d51-389f-40f9-bc98-0c593c7bf1d7 req-0aef2350-6518-4435-b667-891e289f5bfa 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Received event network-vif-deleted-7369f952-1f44-445c-9449-347d6d476d79 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct 10 06:20:52 np0005479823 nova_compute[235775]: 2025-10-10 10:20:52.447 2 DEBUG oslo_concurrency.processutils [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 06:20:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:20:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:52.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:20:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:52 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:20:52 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2442865633' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:20:52 np0005479823 nova_compute[235775]: 2025-10-10 10:20:52.916 2 DEBUG oslo_concurrency.processutils [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 06:20:52 np0005479823 nova_compute[235775]: 2025-10-10 10:20:52.923 2 DEBUG nova.compute.provider_tree [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 06:20:52 np0005479823 nova_compute[235775]: 2025-10-10 10:20:52.940 2 DEBUG nova.scheduler.client.report [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 06:20:52 np0005479823 nova_compute[235775]: 2025-10-10 10:20:52.971 2 DEBUG oslo_concurrency.lockutils [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.585s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 06:20:52 np0005479823 nova_compute[235775]: 2025-10-10 10:20:52.996 2 DEBUG nova.network.neutron [req-4f8402f5-ebfa-402d-bf56-4715e53a561c req-3d7d74c9-030d-4944-93d3-da4a7cb4f958 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updated VIF entry in instance network info cache for port 7369f952-1f44-445c-9449-347d6d476d79. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct 10 06:20:52 np0005479823 nova_compute[235775]: 2025-10-10 10:20:52.996 2 DEBUG nova.network.neutron [req-4f8402f5-ebfa-402d-bf56-4715e53a561c req-3d7d74c9-030d-4944-93d3-da4a7cb4f958 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Updating instance_info_cache with network_info: [{"id": "7369f952-1f44-445c-9449-347d6d476d79", "address": "fa:16:3e:54:86:3b", "network": {"id": "fb3e50c5-fe48-4113-87d7-4e11945ac752", "bridge": "br-int", "label": "tempest-network-smoke--916183171", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d5e531d4b440422d946eaf6fd4e166f7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7369f952-1f", "ovs_interfaceid": "7369f952-1f44-445c-9449-347d6d476d79", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct 10 06:20:53 np0005479823 nova_compute[235775]: 2025-10-10 10:21:53.017 2 INFO nova.scheduler.client.report [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Deleted allocations for instance 4fd38b02-f79c-4eb5-9939-6939dda28a15
Oct 10 06:20:53 np0005479823 nova_compute[235775]: 2025-10-10 10:20:53.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:20:53 np0005479823 nova_compute[235775]: 2025-10-10 10:20:53.022 2 DEBUG oslo_concurrency.lockutils [req-4f8402f5-ebfa-402d-bf56-4715e53a561c req-3d7d74c9-030d-4944-93d3-da4a7cb4f958 3358614a6ba84b89b10fe1d06ba95d87 4c8b489a4ba64bf4a262e05dd1b12019 - - default default] Releasing lock "refresh_cache-4fd38b02-f79c-4eb5-9939-6939dda28a15" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct 10 06:20:53 np0005479823 nova_compute[235775]: 2025-10-10 10:20:53.083 2 DEBUG oslo_concurrency.lockutils [None req-670ebc76-d287-4f26-8d5e-eb461bb8a47a 7956778c03764aaf8906c9b435337976 d5e531d4b440422d946eaf6fd4e166f7 - - default default] Lock "4fd38b02-f79c-4eb5-9939-6939dda28a15" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.795s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 06:20:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:53.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:54.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:54 np0005479823 nova_compute[235775]: 2025-10-10 10:20:54.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 06:20:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:20:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:20:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:20:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:20:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:55.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:20:55 np0005479823 nova_compute[235775]: 2025-10-10 10:20:55.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:20:55 np0005479823 nova_compute[235775]: 2025-10-10 10:20:55.809 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 06:20:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:56.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:56 np0005479823 nova_compute[235775]: 2025-10-10 10:20:56.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 06:20:56 np0005479823 nova_compute[235775]: 2025-10-10 10:20:56.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct 10 06:20:56 np0005479823 nova_compute[235775]: 2025-10-10 10:20:56.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct 10 06:20:56 np0005479823 nova_compute[235775]: 2025-10-10 10:20:56.839 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct 10 06:20:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:57.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:57 np0005479823 nova_compute[235775]: 2025-10-10 10:20:57.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 06:20:57 np0005479823 nova_compute[235775]: 2025-10-10 10:20:57.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 06:20:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:57 np0005479823 nova_compute[235775]: 2025-10-10 10:20:57.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:20:58 np0005479823 nova_compute[235775]: 2025-10-10 10:20:58.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:20:58 np0005479823 nova_compute[235775]: 2025-10-10 10:20:58.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:20:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:20:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:20:58.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:20:58 np0005479823 nova_compute[235775]: 2025-10-10 10:20:58.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 06:20:58 np0005479823 nova_compute[235775]: 2025-10-10 10:20:58.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 06:20:58 np0005479823 nova_compute[235775]: 2025-10-10 10:20:58.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 06:20:58 np0005479823 nova_compute[235775]: 2025-10-10 10:20:58.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 06:20:58 np0005479823 nova_compute[235775]: 2025-10-10 10:20:58.842 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 06:20:58 np0005479823 nova_compute[235775]: 2025-10-10 10:20:58.843 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 06:20:58 np0005479823 nova_compute[235775]: 2025-10-10 10:20:58.843 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 06:20:58 np0005479823 nova_compute[235775]: 2025-10-10 10:20:58.843 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct 10 06:20:58 np0005479823 nova_compute[235775]: 2025-10-10 10:20:58.843 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 06:20:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:59 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:20:59 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3082504773' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:20:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:20:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:20:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:20:59.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:20:59 np0005479823 nova_compute[235775]: 2025-10-10 10:20:59.308 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 06:20:59 np0005479823 nova_compute[235775]: 2025-10-10 10:20:59.474 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 10 06:20:59 np0005479823 nova_compute[235775]: 2025-10-10 10:20:59.475 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4897MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct 10 06:20:59 np0005479823 nova_compute[235775]: 2025-10-10 10:20:59.475 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 06:20:59 np0005479823 nova_compute[235775]: 2025-10-10 10:20:59.476 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 06:20:59 np0005479823 nova_compute[235775]: 2025-10-10 10:20:59.565 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct 10 06:20:59 np0005479823 nova_compute[235775]: 2025-10-10 10:20:59.565 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct 10 06:20:59 np0005479823 nova_compute[235775]: 2025-10-10 10:20:59.585 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 06:20:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:20:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:20:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:20:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:21:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:21:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:20:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:21:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:21:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:21:00 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4120952754' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:21:00 np0005479823 nova_compute[235775]: 2025-10-10 10:21:00.044 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 06:21:00 np0005479823 nova_compute[235775]: 2025-10-10 10:21:00.051 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 06:21:00 np0005479823 nova_compute[235775]: 2025-10-10 10:21:00.067 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 06:21:00 np0005479823 nova_compute[235775]: 2025-10-10 10:21:00.093 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 06:21:00 np0005479823 nova_compute[235775]: 2025-10-10 10:21:00.093 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.618s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 06:21:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:21:00 np0005479823 nova_compute[235775]: 2025-10-10 10:21:00.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:21:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:00.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:01 np0005479823 nova_compute[235775]: 2025-10-10 10:21:01.094 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 06:21:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:01.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:01 np0005479823 nova_compute[235775]: 2025-10-10 10:21:01.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 06:21:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:02.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:02 np0005479823 nova_compute[235775]: 2025-10-10 10:21:02.838 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:21:02 np0005479823 nova_compute[235775]: 2025-10-10 10:21:02.839 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct 10 06:21:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:03 np0005479823 nova_compute[235775]: 2025-10-10 10:21:03.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:21:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:03.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:04.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:21:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:21:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:21:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:21:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:05.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:21:05 np0005479823 nova_compute[235775]: 2025-10-10 10:21:05.526 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1760091650.5248919, 4fd38b02-f79c-4eb5-9939-6939dda28a15 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct 10 06:21:05 np0005479823 nova_compute[235775]: 2025-10-10 10:21:05.526 2 INFO nova.compute.manager [-] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] VM Stopped (Lifecycle Event)#033[00m
Oct 10 06:21:05 np0005479823 nova_compute[235775]: 2025-10-10 10:21:05.547 2 DEBUG nova.compute.manager [None req-da73437c-51ac-4dca-9c3c-3173ab5bd15c - - - - - -] [instance: 4fd38b02-f79c-4eb5-9939-6939dda28a15] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct 10 06:21:05 np0005479823 nova_compute[235775]: 2025-10-10 10:21:05.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:21:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:06.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:06 np0005479823 nova_compute[235775]: 2025-10-10 10:21:06.832 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:21:06 np0005479823 nova_compute[235775]: 2025-10-10 10:21:06.833 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct 10 06:21:06 np0005479823 nova_compute[235775]: 2025-10-10 10:21:06.851 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct 10 06:21:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:07.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:07 np0005479823 podman[247988]: 2025-10-10 10:21:07.788366519 +0000 UTC m=+0.059820136 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible)
Oct 10 06:21:07 np0005479823 podman[247990]: 2025-10-10 10:21:07.809383483 +0000 UTC m=+0.074854329 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 10 06:21:07 np0005479823 podman[247989]: 2025-10-10 10:21:07.81055783 +0000 UTC m=+0.079482266 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 06:21:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:08 np0005479823 nova_compute[235775]: 2025-10-10 10:21:08.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:21:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:08.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:21:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:09.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:21:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:21:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:21:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:21:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:21:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:21:10 np0005479823 nova_compute[235775]: 2025-10-10 10:21:10.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:21:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:10.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:11.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:12.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:13 np0005479823 nova_compute[235775]: 2025-10-10 10:21:13.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:21:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:21:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:13.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:21:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:14.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:21:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:21:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:21:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:21:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:15.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:21:15 np0005479823 nova_compute[235775]: 2025-10-10 10:21:15.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:21:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:16.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:21:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:17.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:21:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:18 np0005479823 nova_compute[235775]: 2025-10-10 10:21:18.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:21:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:18.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:19.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:19 np0005479823 podman[248063]: 2025-10-10 10:21:19.794186015 +0000 UTC m=+0.063664909 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 06:21:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:21:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:21:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:21:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:21:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:21:20 np0005479823 nova_compute[235775]: 2025-10-10 10:21:20.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:21:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:20.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:21.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:22 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:21:22 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:21:22 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:21:22 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:21:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:22.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:23 np0005479823 nova_compute[235775]: 2025-10-10 10:21:23.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:21:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:23.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:24.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:21:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:21:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:21:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:21:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:21:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:25.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:21:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:21:25 np0005479823 nova_compute[235775]: 2025-10-10 10:21:25.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:21:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:26.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:27.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:27 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:21:27 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:21:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:28 np0005479823 nova_compute[235775]: 2025-10-10 10:21:28.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:21:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:28.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:29.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:21:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:21:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:21:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:21:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:21:30 np0005479823 nova_compute[235775]: 2025-10-10 10:21:30.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:21:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:21:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:30.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:21:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:31.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:21:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:32.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:21:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:33 np0005479823 nova_compute[235775]: 2025-10-10 10:21:33.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:21:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:21:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:33.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:21:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.034772) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091694034910, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2373, "num_deletes": 251, "total_data_size": 6361445, "memory_usage": 6441408, "flush_reason": "Manual Compaction"}
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091694058893, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 4092731, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31338, "largest_seqno": 33706, "table_properties": {"data_size": 4083132, "index_size": 6029, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20048, "raw_average_key_size": 20, "raw_value_size": 4063987, "raw_average_value_size": 4155, "num_data_blocks": 259, "num_entries": 978, "num_filter_entries": 978, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760091492, "oldest_key_time": 1760091492, "file_creation_time": 1760091694, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 24157 microseconds, and 15143 cpu microseconds.
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.058948) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 4092731 bytes OK
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.058971) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.061897) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.061918) EVENT_LOG_v1 {"time_micros": 1760091694061912, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.061935) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 6350967, prev total WAL file size 6350967, number of live WAL files 2.
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.063215) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(3996KB)], [60(11MB)]
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091694063238, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 16141387, "oldest_snapshot_seqno": -1}
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 6210 keys, 14032947 bytes, temperature: kUnknown
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091694122602, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 14032947, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13992364, "index_size": 23961, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15557, "raw_key_size": 159050, "raw_average_key_size": 25, "raw_value_size": 13881461, "raw_average_value_size": 2235, "num_data_blocks": 964, "num_entries": 6210, "num_filter_entries": 6210, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760091694, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.122802) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 14032947 bytes
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.124699) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 271.6 rd, 236.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.9, 11.5 +0.0 blob) out(13.4 +0.0 blob), read-write-amplify(7.4) write-amplify(3.4) OK, records in: 6731, records dropped: 521 output_compression: NoCompression
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.124735) EVENT_LOG_v1 {"time_micros": 1760091694124721, "job": 36, "event": "compaction_finished", "compaction_time_micros": 59427, "compaction_time_cpu_micros": 26458, "output_level": 6, "num_output_files": 1, "total_output_size": 14032947, "num_input_records": 6731, "num_output_records": 6210, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091694125706, "job": 36, "event": "table_file_deletion", "file_number": 62}
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091694127720, "job": 36, "event": "table_file_deletion", "file_number": 60}
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.063117) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.127782) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.127789) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.127791) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.127793) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:21:34 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:21:34.127795) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:21:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000065s ======
Oct 10 06:21:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:34.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Oct 10 06:21:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:21:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:21:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:21:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:21:35 np0005479823 ovn_controller[132503]: 2025-10-10T10:21:35Z|00057|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Oct 10 06:21:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:21:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:35.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:21:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:21:35 np0005479823 nova_compute[235775]: 2025-10-10 10:21:35.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:21:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:36.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:37.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:38 np0005479823 nova_compute[235775]: 2025-10-10 10:21:38.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:21:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:38.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:38 np0005479823 podman[248237]: 2025-10-10 10:21:38.818621384 +0000 UTC m=+0.072067458 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 10 06:21:38 np0005479823 podman[248235]: 2025-10-10 10:21:38.829455151 +0000 UTC m=+0.090359313 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 10 06:21:38 np0005479823 podman[248236]: 2025-10-10 10:21:38.8534548 +0000 UTC m=+0.110294073 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 10 06:21:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:39.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:21:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:21:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:21:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:21:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:21:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:40.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:40 np0005479823 nova_compute[235775]: 2025-10-10 10:21:40.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:21:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:41.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:21:41.477 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:21:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:21:41.478 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:21:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:21:41.478 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:21:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:42.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:43 np0005479823 nova_compute[235775]: 2025-10-10 10:21:43.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:21:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000065s ======
Oct 10 06:21:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:43.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Oct 10 06:21:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:44.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:21:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:21:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:21:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:21:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:45.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:21:45 np0005479823 nova_compute[235775]: 2025-10-10 10:21:45.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:21:45 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:21:45.835 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct 10 06:21:45 np0005479823 nova_compute[235775]: 2025-10-10 10:21:45.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:21:45 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:21:45.836 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct 10 06:21:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:21:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:46.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:21:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:47.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:48 np0005479823 nova_compute[235775]: 2025-10-10 10:21:48.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:21:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:48.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:21:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:49.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:21:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:21:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:21:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:21:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:21:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:21:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:50.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:50 np0005479823 nova_compute[235775]: 2025-10-10 10:21:50.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:21:50 np0005479823 podman[248333]: 2025-10-10 10:21:50.775251233 +0000 UTC m=+0.053454962 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent)
Oct 10 06:21:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:21:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:51.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:21:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:52 np0005479823 systemd[1]: virtsecretd.service: Deactivated successfully.
Oct 10 06:21:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:21:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:52.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:21:52 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:21:52.837 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:21:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:53 np0005479823 nova_compute[235775]: 2025-10-10 10:21:53.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:21:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:21:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:53.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:21:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:54.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:21:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:21:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:21:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:21:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:21:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:55.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:21:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:21:55 np0005479823 nova_compute[235775]: 2025-10-10 10:21:55.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:21:55 np0005479823 nova_compute[235775]: 2025-10-10 10:21:55.832 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:21:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:56.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:21:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:57.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:21:57 np0005479823 nova_compute[235775]: 2025-10-10 10:21:57.809 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:21:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:58 np0005479823 nova_compute[235775]: 2025-10-10 10:21:58.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:21:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:21:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:21:58.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:21:58 np0005479823 nova_compute[235775]: 2025-10-10 10:21:58.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:21:58 np0005479823 nova_compute[235775]: 2025-10-10 10:21:58.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 10 06:21:58 np0005479823 nova_compute[235775]: 2025-10-10 10:21:58.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 10 06:21:58 np0005479823 nova_compute[235775]: 2025-10-10 10:21:58.847 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct 10 06:21:58 np0005479823 nova_compute[235775]: 2025-10-10 10:21:58.847 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:21:58 np0005479823 nova_compute[235775]: 2025-10-10 10:21:58.847 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:21:58 np0005479823 nova_compute[235775]: 2025-10-10 10:21:58.875 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:21:58 np0005479823 nova_compute[235775]: 2025-10-10 10:21:58.876 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:21:58 np0005479823 nova_compute[235775]: 2025-10-10 10:21:58.876 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:21:58 np0005479823 nova_compute[235775]: 2025-10-10 10:21:58.876 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 10 06:21:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:58 np0005479823 nova_compute[235775]: 2025-10-10 10:21:58.877 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:21:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:59 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:21:59 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4151122925' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:21:59 np0005479823 nova_compute[235775]: 2025-10-10 10:21:59.329 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:21:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:21:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:21:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:21:59.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:21:59 np0005479823 nova_compute[235775]: 2025-10-10 10:21:59.525 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:21:59 np0005479823 nova_compute[235775]: 2025-10-10 10:21:59.526 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4903MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 10 06:21:59 np0005479823 nova_compute[235775]: 2025-10-10 10:21:59.527 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:21:59 np0005479823 nova_compute[235775]: 2025-10-10 10:21:59.527 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:21:59 np0005479823 nova_compute[235775]: 2025-10-10 10:21:59.599 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 10 06:21:59 np0005479823 nova_compute[235775]: 2025-10-10 10:21:59.600 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 10 06:21:59 np0005479823 nova_compute[235775]: 2025-10-10 10:21:59.863 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:21:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:21:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:21:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:21:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:22:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:22:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:21:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:22:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:22:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:22:00 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2003623' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:22:00 np0005479823 nova_compute[235775]: 2025-10-10 10:22:00.364 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:22:00 np0005479823 nova_compute[235775]: 2025-10-10 10:22:00.373 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 10 06:22:00 np0005479823 nova_compute[235775]: 2025-10-10 10:22:00.395 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 10 06:22:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:22:00 np0005479823 nova_compute[235775]: 2025-10-10 10:22:00.400 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct 10 06:22:00 np0005479823 nova_compute[235775]: 2025-10-10 10:22:00.400 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.873s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:22:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:22:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:00.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:22:00 np0005479823 nova_compute[235775]: 2025-10-10 10:22:00.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:22:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:01 np0005479823 nova_compute[235775]: 2025-10-10 10:22:01.369 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:22:01 np0005479823 nova_compute[235775]: 2025-10-10 10:22:01.370 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:22:01 np0005479823 nova_compute[235775]: 2025-10-10 10:22:01.370 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:22:01 np0005479823 nova_compute[235775]: 2025-10-10 10:22:01.370 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:22:01 np0005479823 nova_compute[235775]: 2025-10-10 10:22:01.370 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct 10 06:22:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:22:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:01.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:22:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:22:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:02.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:22:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:03 np0005479823 nova_compute[235775]: 2025-10-10 10:22:03.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:22:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:03.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:03 np0005479823 nova_compute[235775]: 2025-10-10 10:22:03.811 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:22:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:04.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:22:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:22:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:22:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:22:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:05.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:22:05 np0005479823 nova_compute[235775]: 2025-10-10 10:22:05.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:22:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:22:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:06.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:22:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:22:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:07.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:22:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:08 np0005479823 nova_compute[235775]: 2025-10-10 10:22:08.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:22:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:08.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:09.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:09 np0005479823 podman[248443]: 2025-10-10 10:22:09.812499453 +0000 UTC m=+0.069370662 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct 10 06:22:09 np0005479823 podman[248441]: 2025-10-10 10:22:09.816568013 +0000 UTC m=+0.080976573 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true)
Oct 10 06:22:09 np0005479823 podman[248442]: 2025-10-10 10:22:09.836737439 +0000 UTC m=+0.106345586 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct 10 06:22:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:22:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:22:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:22:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:10 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:22:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:22:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:10.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:10 np0005479823 nova_compute[235775]: 2025-10-10 10:22:10.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:22:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:11.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:12.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:13 np0005479823 nova_compute[235775]: 2025-10-10 10:22:13.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:22:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:13.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:22:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:14.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:22:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:14 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:22:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:14 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:22:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:14 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:22:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:15 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:22:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:22:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:15.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:22:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:22:15 np0005479823 nova_compute[235775]: 2025-10-10 10:22:15.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:22:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:22:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:16.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:22:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:22:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:17.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:22:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:18 np0005479823 nova_compute[235775]: 2025-10-10 10:22:18.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:22:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:18.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:22:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:19.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:22:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:22:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:22:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:22:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:20 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:22:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:22:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:22:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:20.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:22:20 np0005479823 nova_compute[235775]: 2025-10-10 10:22:20.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:22:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:21 np0005479823 podman[248546]: 2025-10-10 10:22:21.044134308 +0000 UTC m=+0.049105383 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 10 06:22:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:21.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:22:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:22.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:22:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:23 np0005479823 nova_compute[235775]: 2025-10-10 10:22:23.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:22:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:23.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:22:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:24.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:22:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:22:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:22:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:22:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:25 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:22:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:25.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:22:25 np0005479823 nova_compute[235775]: 2025-10-10 10:22:25.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:22:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 10 06:22:26 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4103532067' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 06:22:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 10 06:22:26 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4103532067' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 06:22:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:26.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:22:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:27.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:22:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:28 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:22:28 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:22:28 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:22:28 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:22:28 np0005479823 nova_compute[235775]: 2025-10-10 10:22:28.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:22:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:28.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:22:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:29.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.460335) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091749460427, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 777, "num_deletes": 250, "total_data_size": 1453649, "memory_usage": 1476816, "flush_reason": "Manual Compaction"}
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091749468261, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 956064, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33712, "largest_seqno": 34483, "table_properties": {"data_size": 952472, "index_size": 1436, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 7475, "raw_average_key_size": 17, "raw_value_size": 945175, "raw_average_value_size": 2172, "num_data_blocks": 64, "num_entries": 435, "num_filter_entries": 435, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760091694, "oldest_key_time": 1760091694, "file_creation_time": 1760091749, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 7965 microseconds, and 4096 cpu microseconds.
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.468313) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 956064 bytes OK
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.468333) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.470003) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.470024) EVENT_LOG_v1 {"time_micros": 1760091749470017, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.470045) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 1449580, prev total WAL file size 1449580, number of live WAL files 2.
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.470805) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323531' seq:72057594037927935, type:22 .. '6B7600353032' seq:0, type:0; will stop at (end)
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(933KB)], [63(13MB)]
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091749470886, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 14989011, "oldest_snapshot_seqno": -1}
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 6133 keys, 13814758 bytes, temperature: kUnknown
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091749560557, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 13814758, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13774532, "index_size": 23796, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15365, "raw_key_size": 159160, "raw_average_key_size": 25, "raw_value_size": 13664685, "raw_average_value_size": 2228, "num_data_blocks": 942, "num_entries": 6133, "num_filter_entries": 6133, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760091749, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.560803) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 13814758 bytes
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.562372) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 167.1 rd, 154.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 13.4 +0.0 blob) out(13.2 +0.0 blob), read-write-amplify(30.1) write-amplify(14.4) OK, records in: 6645, records dropped: 512 output_compression: NoCompression
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.562404) EVENT_LOG_v1 {"time_micros": 1760091749562390, "job": 38, "event": "compaction_finished", "compaction_time_micros": 89725, "compaction_time_cpu_micros": 37217, "output_level": 6, "num_output_files": 1, "total_output_size": 13814758, "num_input_records": 6645, "num_output_records": 6133, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091749562898, "job": 38, "event": "table_file_deletion", "file_number": 65}
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091749567720, "job": 38, "event": "table_file_deletion", "file_number": 63}
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.470728) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.567789) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.567794) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.567796) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.567800) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:22:29 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:22:29.567802) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:22:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:22:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:22:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:22:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:30 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:22:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:22:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:30.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:30 np0005479823 nova_compute[235775]: 2025-10-10 10:22:30.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:22:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:31.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:32.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:33 np0005479823 nova_compute[235775]: 2025-10-10 10:22:33.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:22:33 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:22:33 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:22:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:22:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:33.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:22:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:22:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:34.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:22:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:22:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:22:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:22:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:35 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:22:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:22:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:22:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:35.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:22:35 np0005479823 nova_compute[235775]: 2025-10-10 10:22:35.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:22:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:36.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:37.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:38 np0005479823 nova_compute[235775]: 2025-10-10 10:22:38.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:22:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:22:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:38.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:22:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:39.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:22:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:22:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:22:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:40 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:22:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:22:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:22:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:40.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:22:40 np0005479823 nova_compute[235775]: 2025-10-10 10:22:40.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:22:40 np0005479823 podman[248701]: 2025-10-10 10:22:40.808069406 +0000 UTC m=+0.070320798 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2)
Oct 10 06:22:40 np0005479823 podman[248703]: 2025-10-10 10:22:40.82851417 +0000 UTC m=+0.082618141 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 06:22:40 np0005479823 podman[248702]: 2025-10-10 10:22:40.841773733 +0000 UTC m=+0.101797934 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 10 06:22:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:41.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:22:41.479 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:22:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:22:41.479 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:22:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:22:41.479 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:22:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:22:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:42.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:22:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:43 np0005479823 nova_compute[235775]: 2025-10-10 10:22:43.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:22:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:43.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:43 np0005479823 systemd-logind[796]: New session 57 of user zuul.
Oct 10 06:22:43 np0005479823 systemd[1]: Started Session 57 of User zuul.
Oct 10 06:22:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:44.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:22:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:22:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:22:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:45 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:22:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:22:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:45.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:45 np0005479823 nova_compute[235775]: 2025-10-10 10:22:45.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:22:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:46.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:47 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Oct 10 06:22:47 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3034678619' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Oct 10 06:22:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:47.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:48 np0005479823 nova_compute[235775]: 2025-10-10 10:22:48.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:22:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:22:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:48.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:22:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:49.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:22:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:22:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:22:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:50 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:22:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:22:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:50.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:50 np0005479823 nova_compute[235775]: 2025-10-10 10:22:50.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:22:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:22:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:51.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:22:51 np0005479823 podman[249136]: 2025-10-10 10:22:51.789557233 +0000 UTC m=+0.064001696 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 10 06:22:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:52.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:52 np0005479823 ovs-vsctl[249187]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct 10 06:22:53 np0005479823 nova_compute[235775]: 2025-10-10 10:22:53.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:22:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:22:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:53.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:22:53 np0005479823 virtqemud[235088]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct 10 06:22:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:53 np0005479823 virtqemud[235088]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct 10 06:22:53 np0005479823 virtqemud[235088]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct 10 06:22:54 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: cache status {prefix=cache status} (starting...)
Oct 10 06:22:54 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: client ls {prefix=client ls} (starting...)
Oct 10 06:22:54 np0005479823 lvm[249544]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct 10 06:22:54 np0005479823 lvm[249544]: VG ceph_vg0 finished
Oct 10 06:22:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:54.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:54 np0005479823 kernel: block dm-0: the capability attribute has been deprecated.
Oct 10 06:22:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:22:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:22:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:22:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:55 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:22:55 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: damage ls {prefix=damage ls} (starting...)
Oct 10 06:22:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Oct 10 06:22:55 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1672425821' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Oct 10 06:22:55 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: dump loads {prefix=dump loads} (starting...)
Oct 10 06:22:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:22:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:22:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:55.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:22:55 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Oct 10 06:22:55 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Oct 10 06:22:55 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Oct 10 06:22:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Oct 10 06:22:55 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1860179327' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 06:22:55 np0005479823 nova_compute[235775]: 2025-10-10 10:22:55.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:22:55 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Oct 10 06:22:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:56 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Oct 10 06:22:56 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4222220785' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Oct 10 06:22:56 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Oct 10 06:22:56 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: get subtrees {prefix=get subtrees} (starting...)
Oct 10 06:22:56 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: ops {prefix=ops} (starting...)
Oct 10 06:22:56 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Oct 10 06:22:56 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3187780686' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Oct 10 06:22:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:56.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:56 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Oct 10 06:22:56 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2388601897' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Oct 10 06:22:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:57 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: session ls {prefix=session ls} (starting...)
Oct 10 06:22:57 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Oct 10 06:22:57 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1390305313' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct 10 06:22:57 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: status {prefix=status} (starting...)
Oct 10 06:22:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:57.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:57 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Oct 10 06:22:57 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/778203399' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Oct 10 06:22:57 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Oct 10 06:22:57 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2332321882' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Oct 10 06:22:57 np0005479823 nova_compute[235775]: 2025-10-10 10:22:57.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:22:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:58 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Oct 10 06:22:58 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3063974506' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Oct 10 06:22:58 np0005479823 nova_compute[235775]: 2025-10-10 10:22:58.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:22:58 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Oct 10 06:22:58 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1401553685' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Oct 10 06:22:58 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Oct 10 06:22:58 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/512450568' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Oct 10 06:22:58 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Oct 10 06:22:58 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/832198139' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 06:22:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:22:58.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:58 np0005479823 nova_compute[235775]: 2025-10-10 10:22:58.809 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:22:58 np0005479823 nova_compute[235775]: 2025-10-10 10:22:58.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:22:58 np0005479823 nova_compute[235775]: 2025-10-10 10:22:58.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:22:58 np0005479823 nova_compute[235775]: 2025-10-10 10:22:58.841 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:22:58 np0005479823 nova_compute[235775]: 2025-10-10 10:22:58.841 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:22:58 np0005479823 nova_compute[235775]: 2025-10-10 10:22:58.841 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:22:58 np0005479823 nova_compute[235775]: 2025-10-10 10:22:58.842 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 10 06:22:58 np0005479823 nova_compute[235775]: 2025-10-10 10:22:58.842 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:22:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:58 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Oct 10 06:22:58 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1758553721' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Oct 10 06:22:58 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Oct 10 06:22:58 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4260196095' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Oct 10 06:22:59 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Oct 10 06:22:59 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/699035021' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Oct 10 06:22:59 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:22:59 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2142363579' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:22:59 np0005479823 nova_compute[235775]: 2025-10-10 10:22:59.293 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:22:59 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Oct 10 06:22:59 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4087547626' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct 10 06:22:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:22:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:22:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:22:59.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:22:59 np0005479823 nova_compute[235775]: 2025-10-10 10:22:59.457 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:22:59 np0005479823 nova_compute[235775]: 2025-10-10 10:22:59.459 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4776MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 10 06:22:59 np0005479823 nova_compute[235775]: 2025-10-10 10:22:59.460 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:22:59 np0005479823 nova_compute[235775]: 2025-10-10 10:22:59.460 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:22:59 np0005479823 nova_compute[235775]: 2025-10-10 10:22:59.533 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 10 06:22:59 np0005479823 nova_compute[235775]: 2025-10-10 10:22:59.534 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 10 06:22:59 np0005479823 nova_compute[235775]: 2025-10-10 10:22:59.557 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:22:59 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Oct 10 06:22:59 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1467050544' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Oct 10 06:22:59 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Oct 10 06:22:59 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2700707402' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Oct 10 06:22:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:22:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:22:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:22:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:23:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:23:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:22:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:23:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:00 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:23:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:23:00 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1751714302' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:23:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Oct 10 06:23:00 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2444127767' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct 10 06:23:00 np0005479823 nova_compute[235775]: 2025-10-10 10:23:00.043 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:23:00 np0005479823 nova_compute[235775]: 2025-10-10 10:23:00.048 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 10 06:23:00 np0005479823 nova_compute[235775]: 2025-10-10 10:23:00.083 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 10 06:23:00 np0005479823 nova_compute[235775]: 2025-10-10 10:23:00.085 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct 10 06:23:00 np0005479823 nova_compute[235775]: 2025-10-10 10:23:00.085 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.625s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:23:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:23:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Oct 10 06:23:00 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/670445082' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876129 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68894720 unmapped: 1318912 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68894720 unmapped: 1318912 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68902912 unmapped: 1310720 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68902912 unmapped: 1310720 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68902912 unmapped: 1310720 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876129 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68911104 unmapped: 1302528 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68911104 unmapped: 1302528 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68919296 unmapped: 1294336 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68919296 unmapped: 1294336 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68919296 unmapped: 1294336 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 876129 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68927488 unmapped: 1286144 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 70.500732422s of 70.510574341s, submitted: 3
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68935680 unmapped: 1277952 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68952064 unmapped: 1261568 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68960256 unmapped: 1253376 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68960256 unmapped: 1253376 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879153 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68968448 unmapped: 1245184 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68976640 unmapped: 1236992 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68976640 unmapped: 1236992 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68984832 unmapped: 1228800 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68984832 unmapped: 1228800 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878562 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68993024 unmapped: 1220608 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68993024 unmapped: 1220608 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 68993024 unmapped: 1220608 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69001216 unmapped: 1212416 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69001216 unmapped: 1212416 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878562 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69001216 unmapped: 1212416 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb2515e000 session 0x55cb222bcf00
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69009408 unmapped: 1204224 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69009408 unmapped: 1204224 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69025792 unmapped: 1187840 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69025792 unmapped: 1187840 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878562 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69025792 unmapped: 1187840 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69042176 unmapped: 1171456 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69042176 unmapped: 1171456 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69050368 unmapped: 1163264 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69050368 unmapped: 1163264 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 878562 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69050368 unmapped: 1163264 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69058560 unmapped: 1155072 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69058560 unmapped: 1155072 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69058560 unmapped: 1155072 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 29.021001816s of 29.033502579s, submitted: 3
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69074944 unmapped: 1138688 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881586 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69074944 unmapped: 1138688 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69066752 unmapped: 1146880 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69074944 unmapped: 1138688 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69074944 unmapped: 1138688 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69083136 unmapped: 1130496 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881586 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69083136 unmapped: 1130496 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69107712 unmapped: 1105920 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69107712 unmapped: 1105920 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69107712 unmapped: 1105920 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69115904 unmapped: 1097728 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880404 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69115904 unmapped: 1097728 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69132288 unmapped: 1081344 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69132288 unmapped: 1081344 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69140480 unmapped: 1073152 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69148672 unmapped: 1064960 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880404 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb2515f000 session 0x55cb22a503c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69148672 unmapped: 1064960 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69156864 unmapped: 1056768 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69165056 unmapped: 1048576 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69165056 unmapped: 1048576 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69173248 unmapped: 1040384 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880404 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69173248 unmapped: 1040384 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69181440 unmapped: 1032192 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69181440 unmapped: 1032192 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69181440 unmapped: 1032192 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69197824 unmapped: 1015808 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880404 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69197824 unmapped: 1015808 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69197824 unmapped: 1015808 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69206016 unmapped: 1007616 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69214208 unmapped: 999424 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69222400 unmapped: 991232 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880404 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69222400 unmapped: 991232 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69222400 unmapped: 991232 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69230592 unmapped: 983040 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69230592 unmapped: 983040 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69230592 unmapped: 983040 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 35.879364014s of 35.892936707s, submitted: 4
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879813 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69238784 unmapped: 974848 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69246976 unmapped: 966656 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69246976 unmapped: 966656 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69255168 unmapped: 958464 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69255168 unmapped: 958464 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69263360 unmapped: 950272 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69271552 unmapped: 942080 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69271552 unmapped: 942080 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69287936 unmapped: 925696 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69287936 unmapped: 925696 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69287936 unmapped: 925696 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69296128 unmapped: 917504 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69296128 unmapped: 917504 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69312512 unmapped: 901120 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69312512 unmapped: 901120 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69312512 unmapped: 901120 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69328896 unmapped: 884736 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69328896 unmapped: 884736 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69337088 unmapped: 876544 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69337088 unmapped: 876544 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69337088 unmapped: 876544 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69361664 unmapped: 851968 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69361664 unmapped: 851968 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69378048 unmapped: 835584 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69386240 unmapped: 827392 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69394432 unmapped: 819200 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69394432 unmapped: 819200 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69402624 unmapped: 811008 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69402624 unmapped: 811008 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69410816 unmapped: 802816 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69410816 unmapped: 802816 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69410816 unmapped: 802816 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69419008 unmapped: 794624 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69427200 unmapped: 786432 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69435392 unmapped: 778240 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69443584 unmapped: 770048 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69451776 unmapped: 761856 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69459968 unmapped: 753664 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69468160 unmapped: 745472 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69468160 unmapped: 745472 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69476352 unmapped: 737280 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69476352 unmapped: 737280 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69476352 unmapped: 737280 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69484544 unmapped: 729088 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69484544 unmapped: 729088 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69492736 unmapped: 720896 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69492736 unmapped: 720896 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69492736 unmapped: 720896 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69500928 unmapped: 712704 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69500928 unmapped: 712704 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69500928 unmapped: 712704 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69525504 unmapped: 688128 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69525504 unmapped: 688128 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69541888 unmapped: 671744 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69550080 unmapped: 663552 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69558272 unmapped: 655360 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69574656 unmapped: 638976 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69574656 unmapped: 638976 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69582848 unmapped: 630784 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69582848 unmapped: 630784 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69582848 unmapped: 630784 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69591040 unmapped: 622592 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69591040 unmapped: 622592 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69591040 unmapped: 622592 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69599232 unmapped: 614400 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69599232 unmapped: 614400 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb229ed400 session 0x55cb24f36960
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69599232 unmapped: 614400 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69607424 unmapped: 606208 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69623808 unmapped: 589824 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69632000 unmapped: 581632 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69640192 unmapped: 573440 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69648384 unmapped: 565248 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69656576 unmapped: 557056 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69656576 unmapped: 557056 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69664768 unmapped: 548864 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 879222 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69664768 unmapped: 548864 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69672960 unmapped: 540672 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69681152 unmapped: 532480 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69681152 unmapped: 532480 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69681152 unmapped: 532480 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 79.674690247s of 79.682121277s, submitted: 2
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 880734 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69689344 unmapped: 524288 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69697536 unmapped: 516096 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69697536 unmapped: 516096 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69722112 unmapped: 491520 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69722112 unmapped: 491520 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69730304 unmapped: 483328 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69754880 unmapped: 458752 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69763072 unmapped: 450560 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69771264 unmapped: 442368 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69771264 unmapped: 442368 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69787648 unmapped: 425984 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69787648 unmapped: 425984 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69787648 unmapped: 425984 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69787648 unmapped: 425984 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69795840 unmapped: 417792 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69795840 unmapped: 417792 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69804032 unmapped: 409600 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69804032 unmapped: 409600 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69820416 unmapped: 393216 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69820416 unmapped: 393216 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69820416 unmapped: 393216 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69828608 unmapped: 385024 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69828608 unmapped: 385024 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69828608 unmapped: 385024 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69836800 unmapped: 376832 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69853184 unmapped: 360448 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69861376 unmapped: 352256 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69869568 unmapped: 344064 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69869568 unmapped: 344064 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69877760 unmapped: 335872 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69877760 unmapped: 335872 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69894144 unmapped: 319488 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69902336 unmapped: 311296 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69902336 unmapped: 311296 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69910528 unmapped: 303104 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69910528 unmapped: 303104 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69910528 unmapped: 303104 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69918720 unmapped: 294912 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69918720 unmapped: 294912 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69918720 unmapped: 294912 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69935104 unmapped: 278528 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69935104 unmapped: 278528 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69951488 unmapped: 262144 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69951488 unmapped: 262144 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69959680 unmapped: 253952 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69967872 unmapped: 245760 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69967872 unmapped: 245760 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69984256 unmapped: 229376 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69984256 unmapped: 229376 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 69984256 unmapped: 229376 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70000640 unmapped: 212992 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70008832 unmapped: 204800 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70017024 unmapped: 196608 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70017024 unmapped: 196608 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70017024 unmapped: 196608 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb252b1800 session 0x55cb24a60d20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70017024 unmapped: 196608 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70025216 unmapped: 188416 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70025216 unmapped: 188416 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70033408 unmapped: 180224 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70033408 unmapped: 180224 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70041600 unmapped: 172032 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70049792 unmapped: 163840 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70057984 unmapped: 155648 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70066176 unmapped: 147456 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70066176 unmapped: 147456 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70074368 unmapped: 139264 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb222fb800 session 0x55cb23919860
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70082560 unmapped: 131072 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70082560 unmapped: 131072 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70090752 unmapped: 122880 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70090752 unmapped: 122880 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70090752 unmapped: 122880 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70098944 unmapped: 114688 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70098944 unmapped: 114688 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70098944 unmapped: 114688 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70107136 unmapped: 106496 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 882246 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70107136 unmapped: 106496 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70123520 unmapped: 90112 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70123520 unmapped: 90112 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70123520 unmapped: 90112 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70131712 unmapped: 81920 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 80.678237915s of 80.685699463s, submitted: 2
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 881655 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70148096 unmapped: 65536 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70148096 unmapped: 65536 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70164480 unmapped: 49152 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70172672 unmapped: 40960 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70172672 unmapped: 40960 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70172672 unmapped: 40960 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70180864 unmapped: 32768 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70180864 unmapped: 32768 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70189056 unmapped: 24576 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70189056 unmapped: 24576 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70197248 unmapped: 16384 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70197248 unmapped: 16384 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70197248 unmapped: 16384 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70205440 unmapped: 8192 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70205440 unmapped: 8192 heap: 70213632 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70221824 unmapped: 1040384 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70230016 unmapped: 1032192 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70238208 unmapped: 1024000 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70246400 unmapped: 1015808 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70246400 unmapped: 1015808 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70270976 unmapped: 991232 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70270976 unmapped: 991232 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70270976 unmapped: 991232 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70279168 unmapped: 983040 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70279168 unmapped: 983040 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70279168 unmapped: 983040 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70287360 unmapped: 974848 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70287360 unmapped: 974848 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70287360 unmapped: 974848 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70295552 unmapped: 966656 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70295552 unmapped: 966656 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70295552 unmapped: 966656 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70303744 unmapped: 958464 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70311936 unmapped: 950272 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70311936 unmapped: 950272 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70320128 unmapped: 942080 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70328320 unmapped: 933888 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70336512 unmapped: 925696 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70344704 unmapped: 917504 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70344704 unmapped: 917504 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70344704 unmapped: 917504 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70361088 unmapped: 901120 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70361088 unmapped: 901120 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70369280 unmapped: 892928 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70369280 unmapped: 892928 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70369280 unmapped: 892928 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70377472 unmapped: 884736 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70377472 unmapped: 884736 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70377472 unmapped: 884736 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70385664 unmapped: 876544 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70385664 unmapped: 876544 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70385664 unmapped: 876544 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70393856 unmapped: 868352 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70393856 unmapped: 868352 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70402048 unmapped: 860160 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70402048 unmapped: 860160 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70402048 unmapped: 860160 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70410240 unmapped: 851968 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70410240 unmapped: 851968 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70410240 unmapped: 851968 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70418432 unmapped: 843776 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70426624 unmapped: 835584 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70434816 unmapped: 827392 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70434816 unmapped: 827392 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70434816 unmapped: 827392 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70451200 unmapped: 811008 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70451200 unmapped: 811008 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70451200 unmapped: 811008 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70459392 unmapped: 802816 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70459392 unmapped: 802816 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70459392 unmapped: 802816 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70467584 unmapped: 794624 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70467584 unmapped: 794624 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 600.0 total, 600.0 interval
Cumulative writes: 5546 writes, 24K keys, 5546 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
Cumulative WAL: 5546 writes, 880 syncs, 6.30 writes per sync, written: 0.02 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 5546 writes, 24K keys, 5546 commit groups, 1.0 writes per commit group, ingest: 18.97 MB, 0.03 MB/s
Interval WAL: 5546 writes, 880 syncs, 6.30 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.0 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.0 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.8e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.0 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_s
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70533120 unmapped: 729088 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70533120 unmapped: 729088 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70533120 unmapped: 729088 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 720896 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70541312 unmapped: 720896 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 712704 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 712704 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb252b0400 session 0x55cb237c54a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70549504 unmapped: 712704 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 704512 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 704512 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70557696 unmapped: 704512 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70565888 unmapped: 696320 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70574080 unmapped: 688128 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70582272 unmapped: 679936 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70582272 unmapped: 679936 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70582272 unmapped: 679936 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70590464 unmapped: 671744 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 884679 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70598656 unmapped: 663552 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70598656 unmapped: 663552 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70615040 unmapped: 647168 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70615040 unmapped: 647168 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 94.012550354s of 94.021888733s, submitted: 3
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70623232 unmapped: 638976 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886191 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70623232 unmapped: 638976 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70623232 unmapped: 638976 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70631424 unmapped: 630784 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70631424 unmapped: 630784 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70639616 unmapped: 622592 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 887112 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70647808 unmapped: 614400 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70647808 unmapped: 614400 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70656000 unmapped: 606208 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70656000 unmapped: 606208 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70664192 unmapped: 598016 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886521 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70672384 unmapped: 589824 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70672384 unmapped: 589824 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70680576 unmapped: 581632 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70680576 unmapped: 581632 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70680576 unmapped: 581632 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886521 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70688768 unmapped: 573440 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70688768 unmapped: 573440 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70696960 unmapped: 565248 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70696960 unmapped: 565248 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70696960 unmapped: 565248 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886521 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70705152 unmapped: 557056 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70705152 unmapped: 557056 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70705152 unmapped: 557056 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70713344 unmapped: 548864 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70713344 unmapped: 548864 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886521 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70721536 unmapped: 540672 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70721536 unmapped: 540672 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70721536 unmapped: 540672 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70729728 unmapped: 532480 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70729728 unmapped: 532480 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886521 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70729728 unmapped: 532480 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70737920 unmapped: 524288 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70737920 unmapped: 524288 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70746112 unmapped: 516096 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70754304 unmapped: 507904 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886521 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70770688 unmapped: 491520 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 483328 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 483328 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70778880 unmapped: 483328 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70787072 unmapped: 475136 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886521 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70787072 unmapped: 475136 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70787072 unmapped: 475136 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70795264 unmapped: 466944 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70795264 unmapped: 466944 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70803456 unmapped: 458752 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886521 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70811648 unmapped: 450560 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70819840 unmapped: 442368 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70819840 unmapped: 442368 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70819840 unmapped: 442368 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70819840 unmapped: 442368 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 51.020427704s of 51.034717560s, submitted: 4
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888033 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 434176 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70828032 unmapped: 434176 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 70901760 unmapped: 360448 heap: 71262208 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 1286144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 1286144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886851 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 1286144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 1286144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 1286144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 1286144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 1286144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886851 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 1286144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 1286144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb222fb800 session 0x55cb24a605a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 1286144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71024640 unmapped: 1286144 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71032832 unmapped: 1277952 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886851 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71032832 unmapped: 1277952 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71032832 unmapped: 1277952 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71032832 unmapped: 1277952 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71032832 unmapped: 1277952 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71032832 unmapped: 1277952 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886851 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71041024 unmapped: 1269760 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71041024 unmapped: 1269760 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71057408 unmapped: 1253376 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71057408 unmapped: 1253376 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71065600 unmapped: 1245184 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 886851 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.256875992s of 26.075702667s, submitted: 205
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71090176 unmapped: 1220608 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71090176 unmapped: 1220608 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 1212416 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71098368 unmapped: 1212416 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71114752 unmapped: 1196032 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889875 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71122944 unmapped: 1187840 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71139328 unmapped: 1171456 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71139328 unmapped: 1171456 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71139328 unmapped: 1171456 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 1155072 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 1138688 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71188480 unmapped: 1122304 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71213056 unmapped: 1097728 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71213056 unmapped: 1097728 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71221248 unmapped: 1089536 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1032192 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1032192 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1032192 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1032192 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 1155072 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 1155072 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 1146880 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 1146880 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 1146880 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 1138688 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71213056 unmapped: 1097728 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71213056 unmapped: 1097728 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb229ed400 session 0x55cb257c9860
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb252b0400 session 0x55cb257770e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 999424 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 999424 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 999424 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 999424 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 983040 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 983040 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 193.200393677s of 193.215057373s, submitted: 3
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 983040 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 966656 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888693 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 966656 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 950272 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 950272 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 950272 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71368704 unmapped: 942080 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890205 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71368704 unmapped: 942080 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71368704 unmapped: 942080 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71368704 unmapped: 942080 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890205 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890205 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 917504 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890205 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 917504 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 917504 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb23c79400 session 0x55cb24f1a3c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 917504 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 909312 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 909312 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890205 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71409664 unmapped: 901120 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71409664 unmapped: 901120 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71409664 unmapped: 901120 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890205 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 34.551929474s of 34.587165833s, submitted: 2
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 891717 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 894150 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 894150 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb252b1800 session 0x55cb25234960
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 894150 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 894150 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71442432 unmapped: 868352 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71442432 unmapped: 868352 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71442432 unmapped: 868352 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 27.263769150s of 27.276128769s, submitted: 4
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897174 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897174 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb22622400 session 0x55cb23c641e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb222fb800 session 0x55cb252354a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 794624 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 794624 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 794624 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 794624 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 794624 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 770048 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 770048 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 41.978160858s of 41.991683960s, submitted: 4
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 770048 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 770048 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71548928 unmapped: 761856 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897504 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 745472 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb252b1c00 session 0x55cb2397f4a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 53.931362152s of 53.942428589s, submitted: 3
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897834 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897834 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897243 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897243 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897243 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.023803711s of 22.031444550s, submitted: 2
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 688128 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896652 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 688128 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:00.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 688128 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 688128 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896652 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896652 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb2515f000 session 0x55cb256783c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896652 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896652 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896652 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 29.378730774s of 29.382411957s, submitted: 1
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898164 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898164 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898164 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb23c79400 session 0x55cb256ab0e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898164 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898164 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898164 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 630784 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 32.485904694s of 32.491119385s, submitted: 1
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899676 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899676 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread fragmentation_score=0.000024 took=0.000092s
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Oct 10 06:23:00 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/356895798' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct 10 06:23:00 np0005479823 nova_compute[235775]: 2025-10-10 10:23:00.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71729152 unmapped: 581632 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71729152 unmapped: 581632 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.0 total, 600.0 interval
Cumulative writes: 5994 writes, 24K keys, 5994 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 5994 writes, 1097 syncs, 5.46 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 448 writes, 699 keys, 448 commit groups, 1.0 writes per commit group, ingest: 0.23 MB, 0.00 MB/s
Interval WAL: 448 writes, 217 syncs, 2.06 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.0 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.0 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.0 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 524288 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 524288 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 524288 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 524288 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb23d67400 session 0x55cb257c9e00
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 116.097770691s of 116.122451782s, submitted: 2
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898494 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898494 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898494 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898494 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898494 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.223886490s of 22.228187561s, submitted: 1
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb222fb800 session 0x55cb25216f00
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 499712 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 901518 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 1490944 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 901518 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 901518 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.055137634s of 15.676420212s, submitted: 212
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb252b0400 session 0x55cb257c9680
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904542 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 903951 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb23d67400 session 0x55cb24a952c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 903360 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 903360 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 903360 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 903360 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 29.002698898s of 29.016599655s, submitted: 4
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75145216 unmapped: 311296 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75145216 unmapped: 311296 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75145216 unmapped: 311296 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75145216 unmapped: 311296 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb252b1c00 session 0x55cb24f1ab40
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 81.090309143s of 81.102882385s, submitted: 3
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 905793 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 905202 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 905202 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb23c79400 session 0x55cb25216000
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 905202 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 905202 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 28.442283630s of 28.465816498s, submitted: 2
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 908226 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 908226 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 907635 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 907635 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb23c79400 session 0x55cb252174a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.102451324s of 16.118749619s, submitted: 3
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75218944 unmapped: 237568 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 144 handle_osd_map epochs [144,145], i have 144, src has [1,145]
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 145 handle_osd_map epochs [145,145], i have 145, src has [1,145]
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 146 handle_osd_map epochs [146,146], i have 146, src has [1,146]
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 146 ms_handle_reset con 0x55cb23d67400 session 0x55cb237c54a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 146 handle_osd_map epochs [146,147], i have 146, src has [1,147]
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0xfdb10/0x1b2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fca64000/0x0/0x4ffc00000, data 0xffae2/0x1b5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926645 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 147 handle_osd_map epochs [147,148], i have 147, src has [1,148]
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 148 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 148 ms_handle_reset con 0x55cb252b1c00 session 0x55cb23dda000
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 65536 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca62000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca62000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929443 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca62000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.733337402s of 13.866815567s, submitted: 54
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930955 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca62000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931459 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca63000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930277 data_alloc: 218103808 data_used: 53248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca63000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930429 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca63000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930429 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca63000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca63000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930429 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca63000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 148 ms_handle_reset con 0x55cb2515f000 session 0x55cb256aa1e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 29.409894943s of 29.425935745s, submitted: 4
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 148 ms_handle_reset con 0x55cb25754c00 session 0x55cb256aa5a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 148 ms_handle_reset con 0x55cb23c79400 session 0x55cb25679e00
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77373440 unmapped: 180224 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930429 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77373440 unmapped: 180224 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 148 handle_osd_map epochs [148,149], i have 148, src has [1,149]
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77373440 unmapped: 1228800 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 149 handle_osd_map epochs [149,150], i have 149, src has [1,150]
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fca5f000/0x0/0x4ffc00000, data 0x103cdd/0x1bc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 150 ms_handle_reset con 0x55cb23d67400 session 0x55cb24a1e960
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 150 ms_handle_reset con 0x55cb2515f000 session 0x55cb25217860
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 150 ms_handle_reset con 0x55cb252b1c00 session 0x55cb250810e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 150 ms_handle_reset con 0x55cb25755000 session 0x55cb23ddb860
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 150 ms_handle_reset con 0x55cb23c79400 session 0x55cb23ddb0e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77832192 unmapped: 11272192 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77840384 unmapped: 11264000 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 150 handle_osd_map epochs [150,151], i have 150, src has [1,151]
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78888960 unmapped: 10215424 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c4000/0x0/0x4ffc00000, data 0xa9ae51/0xb56000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1019665 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb24c6a1e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78938112 unmapped: 10166272 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78938112 unmapped: 10166272 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f000 session 0x55cb25080d20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78938112 unmapped: 10166272 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78938112 unmapped: 10166272 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c4000/0x0/0x4ffc00000, data 0xa9ae51/0xb56000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b1c00 session 0x55cb252a34a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.058108330s of 10.286009789s, submitted: 78
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb257765a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78864384 unmapped: 10240000 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1018571 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78864384 unmapped: 10240000 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c5000/0x0/0x4ffc00000, data 0xa9ae61/0xb57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c5000/0x0/0x4ffc00000, data 0xa9ae61/0xb57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 80191488 unmapped: 8912896 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c5000/0x0/0x4ffc00000, data 0xa9ae61/0xb57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88702976 unmapped: 401408 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88702976 unmapped: 401408 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c5000/0x0/0x4ffc00000, data 0xa9ae61/0xb57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1085887 data_alloc: 234881024 data_used: 10084352
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c5000/0x0/0x4ffc00000, data 0xa9ae61/0xb57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1086640 data_alloc: 234881024 data_used: 10084352
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c5000/0x0/0x4ffc00000, data 0xa9ae61/0xb57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.044337273s of 12.061671257s, submitted: 5
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 95715328 unmapped: 4505600 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 96894976 unmapped: 3325952 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97017856 unmapped: 3203072 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1210611 data_alloc: 234881024 data_used: 10915840
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97026048 unmapped: 3194880 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97026048 unmapped: 3194880 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97026048 unmapped: 3194880 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97026048 unmapped: 3194880 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97026048 unmapped: 3194880 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1210611 data_alloc: 234881024 data_used: 10915840
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97026048 unmapped: 3194880 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97083392 unmapped: 3137536 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97083392 unmapped: 3137536 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97083392 unmapped: 3137536 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97083392 unmapped: 3137536 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1211523 data_alloc: 234881024 data_used: 10985472
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97091584 unmapped: 3129344 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97091584 unmapped: 3129344 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97091584 unmapped: 3129344 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97091584 unmapped: 3129344 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97091584 unmapped: 3129344 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1211523 data_alloc: 234881024 data_used: 10985472
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97099776 unmapped: 3121152 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97099776 unmapped: 3121152 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97099776 unmapped: 3121152 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b1c00 session 0x55cb2397e000
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755800 session 0x55cb2397f2c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755c00 session 0x55cb2397e960
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97099776 unmapped: 3121152 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.521245956s of 22.787330627s, submitted: 111
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515fc00 session 0x55cb24c590e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb251034a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 96673792 unmapped: 3547136 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1207099 data_alloc: 234881024 data_used: 10989568
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb25103680
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515fc00 session 0x55cb25102780
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b1c00 session 0x55cb24a1f860
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755800 session 0x55cb24a1ed20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755c00 session 0x55cb24a1f0e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb22a681e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb2397e3c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9827000/0x0/0x4ffc00000, data 0x2197e71/0x2255000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9827000/0x0/0x4ffc00000, data 0x2197e71/0x2255000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1274581 data_alloc: 234881024 data_used: 10989568
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9827000/0x0/0x4ffc00000, data 0x2197e71/0x2255000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515fc00 session 0x55cb22a501e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97165312 unmapped: 16703488 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b1c00 session 0x55cb222bcf00
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97165312 unmapped: 16703488 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97165312 unmapped: 16703488 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755800 session 0x55cb257765a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.540281296s of 10.644290924s, submitted: 20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb25776f00
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280396 data_alloc: 234881024 data_used: 10989568
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97419264 unmapped: 16449536 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97042432 unmapped: 16826368 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97042432 unmapped: 16826368 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 100958208 unmapped: 12910592 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105037824 unmapped: 8830976 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1337852 data_alloc: 234881024 data_used: 19390464
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105037824 unmapped: 8830976 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105070592 unmapped: 8798208 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105070592 unmapped: 8798208 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105070592 unmapped: 8798208 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105103360 unmapped: 8765440 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1339364 data_alloc: 234881024 data_used: 19390464
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.622159004s of 10.657759666s, submitted: 8
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105103360 unmapped: 8765440 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105103360 unmapped: 8765440 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105103360 unmapped: 8765440 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111075328 unmapped: 3850240 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111214592 unmapped: 3710976 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1423561 data_alloc: 234881024 data_used: 20221952
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8ead000/0x0/0x4ffc00000, data 0x2b07ea4/0x2bc7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111443968 unmapped: 3481600 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8ea5000/0x0/0x4ffc00000, data 0x2b0fea4/0x2bcf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111443968 unmapped: 3481600 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111640576 unmapped: 3284992 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111640576 unmapped: 3284992 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111648768 unmapped: 3276800 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8ea5000/0x0/0x4ffc00000, data 0x2b0fea4/0x2bcf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1423561 data_alloc: 234881024 data_used: 20221952
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111648768 unmapped: 3276800 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.157508850s of 10.318835258s, submitted: 79
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8ea5000/0x0/0x4ffc00000, data 0x2b0fea4/0x2bcf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111673344 unmapped: 3252224 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111673344 unmapped: 3252224 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb257774a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515fc00 session 0x55cb22e632c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111673344 unmapped: 3252224 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b1c00 session 0x55cb25235860
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 12279808 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220056 data_alloc: 234881024 data_used: 10858496
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 12279808 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa067000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 12279808 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa067000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 12279808 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 12279808 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb252165a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23c79400 session 0x55cb251023c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 102637568 unmapped: 12288000 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969360 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb23c64d20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93929472 unmapped: 20996096 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 968651 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 968651 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 968651 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 968651 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 968651 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 34.769847870s of 34.920715332s, submitted: 64
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97558528 unmapped: 28000256 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb256aba40
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb256aa960
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa7c3000/0x0/0x4ffc00000, data 0x11fedef/0x12b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1091241 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa7c3000/0x0/0x4ffc00000, data 0x11fedef/0x12b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb256aad20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1091241 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93896704 unmapped: 31662080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93896704 unmapped: 31662080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa7c3000/0x0/0x4ffc00000, data 0x11fedef/0x12b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99975168 unmapped: 25583616 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99975168 unmapped: 25583616 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99975168 unmapped: 25583616 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1201137 data_alloc: 234881024 data_used: 12959744
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99975168 unmapped: 25583616 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99975168 unmapped: 25583616 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa7c3000/0x0/0x4ffc00000, data 0x11fedef/0x12b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99983360 unmapped: 25575424 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99983360 unmapped: 25575424 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99983360 unmapped: 25575424 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1201137 data_alloc: 234881024 data_used: 12959744
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99991552 unmapped: 25567232 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 100007936 unmapped: 25550848 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 21.422227859s of 21.509000778s, submitted: 20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106987520 unmapped: 18571264 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a14000/0x0/0x4ffc00000, data 0x1faddef/0x2068000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108470272 unmapped: 17088512 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a14000/0x0/0x4ffc00000, data 0x1faddef/0x2068000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108478464 unmapped: 17080320 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1319733 data_alloc: 234881024 data_used: 14139392
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108486656 unmapped: 17072128 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108486656 unmapped: 17072128 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f996e000/0x0/0x4ffc00000, data 0x2053def/0x210e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108486656 unmapped: 17072128 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f996e000/0x0/0x4ffc00000, data 0x2053def/0x210e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108486656 unmapped: 17072128 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107167744 unmapped: 18391040 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311406 data_alloc: 234881024 data_used: 14139392
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107167744 unmapped: 18391040 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107167744 unmapped: 18391040 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f994a000/0x0/0x4ffc00000, data 0x2077def/0x2132000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107167744 unmapped: 18391040 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107175936 unmapped: 18382848 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.874601364s of 12.195711136s, submitted: 124
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 17334272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311150 data_alloc: 234881024 data_used: 14139392
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9944000/0x0/0x4ffc00000, data 0x207ddef/0x2138000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9944000/0x0/0x4ffc00000, data 0x207ddef/0x2138000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311150 data_alloc: 234881024 data_used: 14139392
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24a605a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 17317888 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9941000/0x0/0x4ffc00000, data 0x2080def/0x213b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 17317888 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 17317888 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311694 data_alloc: 234881024 data_used: 14151680
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 17317888 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 17317888 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 17317888 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.061926842s of 14.079800606s, submitted: 4
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9930000/0x0/0x4ffc00000, data 0x2091def/0x214c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 17195008 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9930000/0x0/0x4ffc00000, data 0x2091def/0x214c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb25678f00
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 17186816 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 982200 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23c79400 session 0x55cb25824b40
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa95b000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98844672 unmapped: 26714112 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98254848 unmapped: 27303936 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa95b000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98254848 unmapped: 27303936 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98254848 unmapped: 27303936 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa95b000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983712 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984032 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984032 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984032 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984032 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb24f1b4a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515fc00 session 0x55cb24f1ad20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24f1a3c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23c79400 session 0x55cb24f1a780
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 28.527706146s of 28.564867020s, submitted: 20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 100900864 unmapped: 24657920 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb24f1a1e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98394112 unmapped: 30842880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98394112 unmapped: 30842880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98394112 unmapped: 30842880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065364 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb249925a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98394112 unmapped: 30842880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b1c00 session 0x55cb24992f00
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98213888 unmapped: 31023104 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24c6be00
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23c79400 session 0x55cb24c6a1e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faea5000/0x0/0x4ffc00000, data 0xb1cdef/0xbd7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97878016 unmapped: 31358976 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97878016 unmapped: 31358976 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97427456 unmapped: 31809536 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1092582 data_alloc: 218103808 data_used: 3399680
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99352576 unmapped: 29884416 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99434496 unmapped: 29802496 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fae80000/0x0/0x4ffc00000, data 0xb40dff/0xbfc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99467264 unmapped: 29769728 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fae80000/0x0/0x4ffc00000, data 0xb40dff/0xbfc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1131798 data_alloc: 234881024 data_used: 9220096
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fae80000/0x0/0x4ffc00000, data 0xb40dff/0xbfc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fae80000/0x0/0x4ffc00000, data 0xb40dff/0xbfc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fae80000/0x0/0x4ffc00000, data 0xb40dff/0xbfc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1133926 data_alloc: 234881024 data_used: 9277440
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.591819763s of 18.735073090s, submitted: 24
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fae80000/0x0/0x4ffc00000, data 0xb40dff/0xbfc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107536384 unmapped: 21700608 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107692032 unmapped: 21544960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107692032 unmapped: 21544960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107692032 unmapped: 21544960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107692032 unmapped: 21544960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1232606 data_alloc: 234881024 data_used: 9793536
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9f04000/0x0/0x4ffc00000, data 0x16abdff/0x1767000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9f04000/0x0/0x4ffc00000, data 0x16abdff/0x1767000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107724800 unmapped: 21512192 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9f04000/0x0/0x4ffc00000, data 0x16abdff/0x1767000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106569728 unmapped: 22667264 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ee1000/0x0/0x4ffc00000, data 0x16cfdff/0x178b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106586112 unmapped: 22650880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25754800 session 0x55cb25081680
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106586112 unmapped: 22650880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106586112 unmapped: 22650880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1230270 data_alloc: 234881024 data_used: 9854976
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106586112 unmapped: 22650880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ee1000/0x0/0x4ffc00000, data 0x16cfdff/0x178b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106586112 unmapped: 22650880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.891391754s of 12.132454872s, submitted: 125
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106610688 unmapped: 22626304 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ed7000/0x0/0x4ffc00000, data 0x16d9dff/0x1795000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106610688 unmapped: 22626304 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106610688 unmapped: 22626304 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1230430 data_alloc: 234881024 data_used: 9854976
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106610688 unmapped: 22626304 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106610688 unmapped: 22626304 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515e400 session 0x55cb24f1ba40
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106610688 unmapped: 22626304 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515ec00 session 0x55cb24a1bc20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106627072 unmapped: 22609920 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb222bcf00
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f97ba000/0x0/0x4ffc00000, data 0x1df6dff/0x1eb2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107798528 unmapped: 21438464 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1296232 data_alloc: 234881024 data_used: 9854976
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9630000/0x0/0x4ffc00000, data 0x1f80dff/0x203c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 21405696 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 21405696 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 21405696 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 21405696 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.719927788s of 11.824364662s, submitted: 25
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23c79400 session 0x55cb24a1e960
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f962d000/0x0/0x4ffc00000, data 0x1f83dff/0x203f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107929600 unmapped: 21307392 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1300762 data_alloc: 234881024 data_used: 9854976
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107921408 unmapped: 21315584 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 14811136 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 14811136 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 14811136 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 14811136 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1360346 data_alloc: 234881024 data_used: 18640896
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f962c000/0x0/0x4ffc00000, data 0x1f83e22/0x2040000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114458624 unmapped: 14778368 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f962c000/0x0/0x4ffc00000, data 0x1f83e22/0x2040000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 14745600 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114565120 unmapped: 14671872 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb2397f2c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114565120 unmapped: 14671872 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114565120 unmapped: 14671872 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1361578 data_alloc: 234881024 data_used: 18644992
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9629000/0x0/0x4ffc00000, data 0x1f84e22/0x2041000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114565120 unmapped: 14671872 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.852634430s of 11.898006439s, submitted: 19
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122814464 unmapped: 6422528 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 7430144 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8664000/0x0/0x4ffc00000, data 0x2f4be22/0x3008000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 7430144 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 7430144 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1493940 data_alloc: 234881024 data_used: 20471808
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8664000/0x0/0x4ffc00000, data 0x2f4be22/0x3008000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 7397376 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8664000/0x0/0x4ffc00000, data 0x2f4be22/0x3008000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 7397376 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 7389184 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb237c5a40
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515e400 session 0x55cb252350e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 7389184 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb23c1e5a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245624 data_alloc: 234881024 data_used: 9854976
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ed2000/0x0/0x4ffc00000, data 0x16dddff/0x1799000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ed2000/0x0/0x4ffc00000, data 0x16dddff/0x1799000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ed2000/0x0/0x4ffc00000, data 0x16dddff/0x1799000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245624 data_alloc: 234881024 data_used: 9854976
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ed2000/0x0/0x4ffc00000, data 0x16dddff/0x1799000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ed2000/0x0/0x4ffc00000, data 0x16dddff/0x1799000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb24c6ad20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb257163c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.070486069s of 16.434377670s, submitted: 164
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb237c5c20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1014831 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1014831 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1014831 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1014831 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1014831 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.723497391s of 26.753862381s, submitted: 18
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb239192c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb23c1fe00
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb252a21e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb250814a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb257c8d20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106659840 unmapped: 34717696 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1118589 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106659840 unmapped: 34717696 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106659840 unmapped: 34717696 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106659840 unmapped: 34717696 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa704000/0x0/0x4ffc00000, data 0xeace51/0xf68000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106659840 unmapped: 34717696 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb2397e1e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb2397e3c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106676224 unmapped: 34701312 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa704000/0x0/0x4ffc00000, data 0xeace51/0xf68000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1118589 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb25678f00
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb257774a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106135552 unmapped: 35241984 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa6de000/0x0/0x4ffc00000, data 0xed0e84/0xf8e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106135552 unmapped: 35241984 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105660416 unmapped: 35717120 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa6de000/0x0/0x4ffc00000, data 0xed0e84/0xf8e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220197 data_alloc: 234881024 data_used: 14155776
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa6de000/0x0/0x4ffc00000, data 0xed0e84/0xf8e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220197 data_alloc: 234881024 data_used: 14155776
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa6de000/0x0/0x4ffc00000, data 0xed0e84/0xf8e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.917535782s of 19.063180923s, submitted: 58
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118210560 unmapped: 23166976 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 23740416 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ab4000/0x0/0x4ffc00000, data 0x1afae84/0x1bb8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1331335 data_alloc: 234881024 data_used: 14376960
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 23740416 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 23740416 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 23740416 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 23740416 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9aa6000/0x0/0x4ffc00000, data 0x1b08e84/0x1bc6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9aa6000/0x0/0x4ffc00000, data 0x1b08e84/0x1bc6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 23740416 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1331335 data_alloc: 234881024 data_used: 14376960
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a82000/0x0/0x4ffc00000, data 0x1b2ce84/0x1bea000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.0 total, 600.0 interval#012Cumulative writes: 8276 writes, 33K keys, 8276 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s#012Cumulative WAL: 8276 writes, 2019 syncs, 4.10 writes per sync, written: 0.03 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2282 writes, 8748 keys, 2282 commit groups, 1.0 writes per commit group, ingest: 10.36 MB, 0.02 MB/s#012Interval WAL: 2282 writes, 922 syncs, 2.48 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a82000/0x0/0x4ffc00000, data 0x1b2ce84/0x1bea000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326463 data_alloc: 234881024 data_used: 14381056
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.047369957s of 13.274172783s, submitted: 111
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a7c000/0x0/0x4ffc00000, data 0x1b32e84/0x1bf0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a7c000/0x0/0x4ffc00000, data 0x1b32e84/0x1bf0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326375 data_alloc: 234881024 data_used: 14381056
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a6c000/0x0/0x4ffc00000, data 0x1b42e84/0x1c00000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 23633920 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 23633920 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1327543 data_alloc: 234881024 data_used: 14389248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 23633920 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 23633920 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a6c000/0x0/0x4ffc00000, data 0x1b42e84/0x1c00000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 23633920 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 23633920 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb24a1f860
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d02800 session 0x55cb250803c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb25716b40
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb24f1ab40
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.387310028s of 13.403597832s, submitted: 5
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb24f1b680
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb24c6b680
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23944400 session 0x55cb24c6a960
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb25217860
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118210560 unmapped: 23166976 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb24a61c20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1397910 data_alloc: 234881024 data_used: 14389248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f91b1000/0x0/0x4ffc00000, data 0x23fbef6/0x24bb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118210560 unmapped: 23166976 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f91b1000/0x0/0x4ffc00000, data 0x23fbef6/0x24bb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118210560 unmapped: 23166976 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118210560 unmapped: 23166976 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118210560 unmapped: 23166976 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23944400 session 0x55cb257c9680
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118538240 unmapped: 22839296 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1402283 data_alloc: 234881024 data_used: 14389248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f918b000/0x0/0x4ffc00000, data 0x2420ef6/0x24e0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118554624 unmapped: 22822912 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121602048 unmapped: 19775488 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126173184 unmapped: 15204352 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126173184 unmapped: 15204352 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f918b000/0x0/0x4ffc00000, data 0x2420ef6/0x24e0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126263296 unmapped: 15114240 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1463991 data_alloc: 234881024 data_used: 23457792
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126263296 unmapped: 15114240 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126263296 unmapped: 15114240 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9188000/0x0/0x4ffc00000, data 0x2424ef6/0x24e4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126263296 unmapped: 15114240 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126263296 unmapped: 15114240 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126263296 unmapped: 15114240 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1463991 data_alloc: 234881024 data_used: 23457792
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.817728996s of 15.977606773s, submitted: 48
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126337024 unmapped: 15040512 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130342912 unmapped: 11034624 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131219456 unmapped: 10158080 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8654000/0x0/0x4ffc00000, data 0x2f58ef6/0x3018000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131219456 unmapped: 10158080 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f863e000/0x0/0x4ffc00000, data 0x2f6eef6/0x302e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131252224 unmapped: 10125312 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1564155 data_alloc: 234881024 data_used: 24436736
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131293184 unmapped: 10084352 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f863b000/0x0/0x4ffc00000, data 0x2f71ef6/0x3031000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1560275 data_alloc: 234881024 data_used: 24440832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f863b000/0x0/0x4ffc00000, data 0x2f71ef6/0x3031000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.395318985s of 12.603665352s, submitted: 111
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130990080 unmapped: 10387456 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8635000/0x0/0x4ffc00000, data 0x2f77ef6/0x3037000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130990080 unmapped: 10387456 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1560555 data_alloc: 234881024 data_used: 24440832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130990080 unmapped: 10387456 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8635000/0x0/0x4ffc00000, data 0x2f77ef6/0x3037000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130998272 unmapped: 10379264 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130998272 unmapped: 10379264 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130998272 unmapped: 10379264 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8632000/0x0/0x4ffc00000, data 0x2f7aef6/0x303a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131039232 unmapped: 10338304 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1560411 data_alloc: 234881024 data_used: 24440832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131039232 unmapped: 10338304 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131039232 unmapped: 10338304 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131039232 unmapped: 10338304 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8632000/0x0/0x4ffc00000, data 0x2f7aef6/0x303a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8632000/0x0/0x4ffc00000, data 0x2f7aef6/0x303a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131072000 unmapped: 10305536 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131072000 unmapped: 10305536 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.019625664s of 12.032996178s, submitted: 5
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1561307 data_alloc: 234881024 data_used: 24440832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131096576 unmapped: 10280960 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131096576 unmapped: 10280960 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131096576 unmapped: 10280960 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131096576 unmapped: 10280960 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8629000/0x0/0x4ffc00000, data 0x2f80ef6/0x3040000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 10272768 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1560963 data_alloc: 234881024 data_used: 24440832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 10272768 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 10272768 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 10272768 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 10272768 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 10272768 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1561267 data_alloc: 234881024 data_used: 24440832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8627000/0x0/0x4ffc00000, data 0x2f84ef6/0x3044000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.661789894s of 10.690342903s, submitted: 10
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131137536 unmapped: 10240000 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8625000/0x0/0x4ffc00000, data 0x2f87ef6/0x3047000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131170304 unmapped: 10207232 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131178496 unmapped: 10199040 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb229ed400 session 0x55cb2397fe00
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131178496 unmapped: 10199040 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8625000/0x0/0x4ffc00000, data 0x2f87ef6/0x3047000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131178496 unmapped: 10199040 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8625000/0x0/0x4ffc00000, data 0x2f87ef6/0x3047000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1561187 data_alloc: 234881024 data_used: 24440832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130637824 unmapped: 10739712 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130637824 unmapped: 10739712 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130646016 unmapped: 10731520 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f861f000/0x0/0x4ffc00000, data 0x2f8def6/0x304d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130727936 unmapped: 10649600 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130924544 unmapped: 10452992 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1560899 data_alloc: 234881024 data_used: 24440832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130924544 unmapped: 10452992 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130932736 unmapped: 10444800 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.915495872s of 11.539477348s, submitted: 233
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130940928 unmapped: 10436608 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f861c000/0x0/0x4ffc00000, data 0x2f90ef6/0x3050000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130940928 unmapped: 10436608 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb24b1d680
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb256ab2c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f861c000/0x0/0x4ffc00000, data 0x2f90ef6/0x3050000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb23dda3c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1344411 data_alloc: 234881024 data_used: 14389248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f93d1000/0x0/0x4ffc00000, data 0x1b92e84/0x1c50000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f93f2000/0x0/0x4ffc00000, data 0x1b71e84/0x1c2f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1345059 data_alloc: 234881024 data_used: 14389248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f93f2000/0x0/0x4ffc00000, data 0x1b71e84/0x1c2f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb24a1ed20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb25102b40
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a3b000/0x0/0x4ffc00000, data 0x1b71e84/0x1c2f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.092136383s of 10.210209846s, submitted: 50
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb25102780
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051980 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051980 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051980 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051980 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051980 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114302976 unmapped: 27074560 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114302976 unmapped: 27074560 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114302976 unmapped: 27074560 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114302976 unmapped: 27074560 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23944400 session 0x55cb24f36f00
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24f37860
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb24f363c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb24f374a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.982812881s of 27.189233780s, submitted: 65
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb22e62960
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb257770e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111869952 unmapped: 33710080 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faaac000/0x0/0x4ffc00000, data 0xb04e51/0xbc0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1129727 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111869952 unmapped: 33710080 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faaac000/0x0/0x4ffc00000, data 0xb04e51/0xbc0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb257761e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111869952 unmapped: 33710080 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb257774a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb25776d20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111869952 unmapped: 33710080 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb257763c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111886336 unmapped: 33693696 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111886336 unmapped: 33693696 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1166738 data_alloc: 218103808 data_used: 5058560
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 113631232 unmapped: 31948800 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 113631232 unmapped: 31948800 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faaab000/0x0/0x4ffc00000, data 0xb04e61/0xbc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 113631232 unmapped: 31948800 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 113639424 unmapped: 31940608 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb22e62d20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.018430710s of 10.130161285s, submitted: 41
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb25080780
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faaab000/0x0/0x4ffc00000, data 0xb04e61/0xbc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 35414016 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb25717a40
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1059699 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1059699 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1059699 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb25716f00
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb24b1dc20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb24b1c960
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb23c1e5a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.854639053s of 12.951243401s, submitted: 33
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb23c1f4a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb249781e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb24a61860
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb257c9c20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb257c85a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110026752 unmapped: 39755776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110026752 unmapped: 39755776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110026752 unmapped: 39755776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1126674 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110026752 unmapped: 39755776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110026752 unmapped: 39755776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb257c8000
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb24a601e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110026752 unmapped: 39755776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fab9e000/0x0/0x4ffc00000, data 0xa13def/0xace000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb24a605a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb250e1800 session 0x55cb257770e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110034944 unmapped: 39747584 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110034944 unmapped: 39747584 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: mgrc ms_handle_reset ms_handle_reset con 0x55cb22623000
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/194506248
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/194506248,v1:192.168.122.100:6801/194506248]
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: mgrc handle_mgr_configure stats_period=5
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1128267 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 39682048 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 113369088 unmapped: 36413440 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fab9e000/0x0/0x4ffc00000, data 0xa13def/0xace000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114548736 unmapped: 35233792 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114548736 unmapped: 35233792 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114548736 unmapped: 35233792 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192259 data_alloc: 234881024 data_used: 9543680
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114548736 unmapped: 35233792 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.485980988s of 13.640996933s, submitted: 24
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb250e1800 session 0x55cb25776d20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb257772c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb252350e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065573 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065573 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065573 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065573 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb22a51c20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb22a503c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb22a50000
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb222bcf00
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 23.271076202s of 23.354894638s, submitted: 36
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb25080780
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb250e1800 session 0x55cb24b1d680
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb24979a40
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb24f37e00
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb2397f2c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156560 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa6e000/0x0/0x4ffc00000, data 0xb42dff/0xbfe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156560 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa6e000/0x0/0x4ffc00000, data 0xb42dff/0xbfe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb23d3e960
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111591424 unmapped: 38191104 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa4a000/0x0/0x4ffc00000, data 0xb66dff/0xc22000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111591424 unmapped: 38191104 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa4a000/0x0/0x4ffc00000, data 0xb66dff/0xc22000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182488 data_alloc: 218103808 data_used: 3469312
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa4a000/0x0/0x4ffc00000, data 0xb66dff/0xc22000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111968256 unmapped: 37814272 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa4a000/0x0/0x4ffc00000, data 0xb66dff/0xc22000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1221704 data_alloc: 234881024 data_used: 9289728
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa4a000/0x0/0x4ffc00000, data 0xb66dff/0xc22000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.874525070s of 20.991596222s, submitted: 39
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1303724 data_alloc: 234881024 data_used: 9342976
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123199488 unmapped: 26583040 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9cc0000/0x0/0x4ffc00000, data 0x18f0dff/0x19ac000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 26238976 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 26238976 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 26238976 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 26238976 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1373710 data_alloc: 234881024 data_used: 10682368
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 26238976 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 26238976 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f972d000/0x0/0x4ffc00000, data 0x1e83dff/0x1f3f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1375014 data_alloc: 234881024 data_used: 10694656
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f970f000/0x0/0x4ffc00000, data 0x1ea1dff/0x1f5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.182755470s of 14.495874405s, submitted: 173
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123781120 unmapped: 26001408 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb23d3f0e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb250e6000 session 0x55cb257163c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1374934 data_alloc: 234881024 data_used: 10694656
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb250e6000 session 0x55cb2397e960
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1084648 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1084648 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1084648 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1084648 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1084648 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb237c5c20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb2397e000
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb23c1f0e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb257772c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.957300186s of 26.056312561s, submitted: 38
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb22e63680
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24a95860
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb24a1ed20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22b01000 session 0x55cb22a68d20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb239183c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 115048448 unmapped: 46800896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 115048448 unmapped: 46800896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 115048448 unmapped: 46800896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa2f1000/0x0/0x4ffc00000, data 0x12bfdff/0x137b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 115048448 unmapped: 46800896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa2f1000/0x0/0x4ffc00000, data 0x12bfdff/0x137b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 115048448 unmapped: 46800896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb24a1a1e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1217690 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 115064832 unmapped: 46784512 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 43900928 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 36888576 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa2f1000/0x0/0x4ffc00000, data 0x12bfdff/0x137b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 36888576 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 36888576 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1334994 data_alloc: 234881024 data_used: 17313792
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 36888576 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 36888576 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa2f1000/0x0/0x4ffc00000, data 0x12bfdff/0x137b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 36855808 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 36855808 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa2f1000/0x0/0x4ffc00000, data 0x12bfdff/0x137b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125026304 unmapped: 36823040 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1334994 data_alloc: 234881024 data_used: 17313792
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 36814848 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.868372917s of 16.010391235s, submitted: 40
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130957312 unmapped: 30892032 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f93b2000/0x0/0x4ffc00000, data 0x1deddff/0x1ea9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131473408 unmapped: 30375936 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e800 session 0x55cb258243c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e000 session 0x55cb25825e00
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e400 session 0x55cb258252c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816fc00 session 0x55cb258250e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb25824f00
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131596288 unmapped: 30253056 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131678208 unmapped: 30171136 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1516856 data_alloc: 234881024 data_used: 18149376
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131678208 unmapped: 30171136 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e000 session 0x55cb24f374a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131678208 unmapped: 30171136 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8737000/0x0/0x4ffc00000, data 0x2a68dff/0x2b24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e400 session 0x55cb24f361e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131678208 unmapped: 30171136 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131678208 unmapped: 30171136 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e800 session 0x55cb24f37680
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816f000 session 0x55cb24f37a40
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131702784 unmapped: 30146560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1519196 data_alloc: 234881024 data_used: 18149376
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131702784 unmapped: 30146560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8712000/0x0/0x4ffc00000, data 0x2a8ce32/0x2b4a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 133881856 unmapped: 27967488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140607488 unmapped: 21241856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.067866325s of 12.396329880s, submitted: 138
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f870f000/0x0/0x4ffc00000, data 0x2a8fe32/0x2b4d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1595808 data_alloc: 251658240 data_used: 29470720
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f870f000/0x0/0x4ffc00000, data 0x2a8fe32/0x2b4d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1595808 data_alloc: 251658240 data_used: 29470720
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140648448 unmapped: 21200896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143187968 unmapped: 18661376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f7d06000/0x0/0x4ffc00000, data 0x3498e32/0x3556000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.850776672s of 10.000297546s, submitted: 56
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143392768 unmapped: 18456576 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143532032 unmapped: 18317312 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1680754 data_alloc: 251658240 data_used: 29532160
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143532032 unmapped: 18317312 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143532032 unmapped: 18317312 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143532032 unmapped: 18317312 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143532032 unmapped: 18317312 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f7cd8000/0x0/0x4ffc00000, data 0x34c5e32/0x3583000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f7cd8000/0x0/0x4ffc00000, data 0x34c5e32/0x3583000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143597568 unmapped: 18251776 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1679274 data_alloc: 251658240 data_used: 29532160
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143663104 unmapped: 18186240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143663104 unmapped: 18186240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143663104 unmapped: 18186240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f7cd4000/0x0/0x4ffc00000, data 0x34c9e32/0x3587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1679746 data_alloc: 251658240 data_used: 29532160
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f7cd4000/0x0/0x4ffc00000, data 0x34c9e32/0x3587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1679746 data_alloc: 251658240 data_used: 29532160
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f7cd4000/0x0/0x4ffc00000, data 0x34c9e32/0x3587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.070764542s of 17.136646271s, submitted: 20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143728640 unmapped: 18120704 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143728640 unmapped: 18120704 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb257c94a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e000 session 0x55cb25678960
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 136912896 unmapped: 24936448 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e400 session 0x55cb22a51c20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 136937472 unmapped: 24911872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f92e4000/0x0/0x4ffc00000, data 0x1ebbdff/0x1f77000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 136937472 unmapped: 24911872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1445716 data_alloc: 234881024 data_used: 18149376
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 136937472 unmapped: 24911872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 136937472 unmapped: 24911872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f92e4000/0x0/0x4ffc00000, data 0x1ebbdff/0x1f77000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22b01000 session 0x55cb252a25a0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24a1be00
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 136953856 unmapped: 24895488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24992960
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114672 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb09a000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb09a000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114672 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb09a000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114672 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb09a000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114672 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb09a000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114672 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb09a000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22b01000 session 0x55cb24b1c960
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb24c6a1e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123912192 unmapped: 37937152 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e000 session 0x55cb24c6ba40
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e400 session 0x55cb24a941e0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 33.266139984s of 33.476127625s, submitted: 90
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24a95c20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22b01000 session 0x55cb257c8d20
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb22a51e00
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e000 session 0x55cb25717a40
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e800 session 0x55cb25717860
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162264 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fab77000/0x0/0x4ffc00000, data 0x628e61/0x6e5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162264 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24a60000
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fab77000/0x0/0x4ffc00000, data 0x628e61/0x6e5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197892 data_alloc: 218103808 data_used: 5349376
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fab77000/0x0/0x4ffc00000, data 0x628e61/0x6e5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 39362560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 39362560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197892 data_alloc: 218103808 data_used: 5349376
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 39362560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 39362560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fab77000/0x0/0x4ffc00000, data 0x628e61/0x6e5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.718759537s of 18.834480286s, submitted: 44
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125886464 unmapped: 35962880 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129695744 unmapped: 32153600 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129949696 unmapped: 31899648 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266056 data_alloc: 218103808 data_used: 6639616
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129949696 unmapped: 31899648 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129949696 unmapped: 31899648 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9254000/0x0/0x4ffc00000, data 0xdabe61/0xe68000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 31891456 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 31891456 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 31891456 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266056 data_alloc: 218103808 data_used: 6639616
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 31891456 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 31891456 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9254000/0x0/0x4ffc00000, data 0xdabe61/0xe68000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 31891456 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.704682350s of 10.921176910s, submitted: 89
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22b01000 session 0x55cb24a612c0
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129826816 unmapped: 32022528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb25678960
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125788160 unmapped: 36061184 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125788160 unmapped: 36061184 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125788160 unmapped: 36061184 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125788160 unmapped: 36061184 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: do_command 'config diff' '{prefix=config diff}'
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125870080 unmapped: 35979264 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: do_command 'config show' '{prefix=config show}'
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: do_command 'counter dump' '{prefix=counter dump}'
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: do_command 'counter schema' '{prefix=counter schema}'
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125419520 unmapped: 36429824 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125558784 unmapped: 36290560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:23:00 np0005479823 ceph-osd[77423]: do_command 'log dump' '{prefix=log dump}'
Oct 10 06:23:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Oct 10 06:23:00 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/320126284' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Oct 10 06:23:01 np0005479823 nova_compute[235775]: 2025-10-10 10:23:01.085 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:23:01 np0005479823 nova_compute[235775]: 2025-10-10 10:23:01.086 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 10 06:23:01 np0005479823 nova_compute[235775]: 2025-10-10 10:23:01.086 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 10 06:23:01 np0005479823 nova_compute[235775]: 2025-10-10 10:23:01.116 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct 10 06:23:01 np0005479823 nova_compute[235775]: 2025-10-10 10:23:01.116 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:23:01 np0005479823 nova_compute[235775]: 2025-10-10 10:23:01.116 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:23:01 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Oct 10 06:23:01 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/127520907' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 06:23:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:23:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:01.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:23:01 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Oct 10 06:23:01 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/196607994' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Oct 10 06:23:01 np0005479823 nova_compute[235775]: 2025-10-10 10:23:01.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:23:01 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Oct 10 06:23:01 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1623996306' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct 10 06:23:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:02 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Oct 10 06:23:02 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1196706439' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Oct 10 06:23:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:02.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:02 np0005479823 nova_compute[235775]: 2025-10-10 10:23:02.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:23:02 np0005479823 nova_compute[235775]: 2025-10-10 10:23:02.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct 10 06:23:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:03 np0005479823 nova_compute[235775]: 2025-10-10 10:23:03.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:23:03 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Oct 10 06:23:03 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/563292797' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Oct 10 06:23:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:03.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:03 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Oct 10 06:23:03 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/781763434' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Oct 10 06:23:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:04 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Oct 10 06:23:04 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2622965518' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Oct 10 06:23:04 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Oct 10 06:23:04 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/54031956' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Oct 10 06:23:04 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Oct 10 06:23:04 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1788666203' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Oct 10 06:23:04 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Oct 10 06:23:04 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1918956957' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Oct 10 06:23:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:04.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:04 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Oct 10 06:23:04 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/983784123' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Oct 10 06:23:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:04 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Oct 10 06:23:04 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1496940488' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Oct 10 06:23:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:23:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:23:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:23:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:05 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:23:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Oct 10 06:23:05 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3249578032' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Oct 10 06:23:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:23:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:23:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:05.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:23:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Oct 10 06:23:05 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1692194911' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Oct 10 06:23:05 np0005479823 nova_compute[235775]: 2025-10-10 10:23:05.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:23:05 np0005479823 systemd[1]: Starting Hostname Service...
Oct 10 06:23:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Oct 10 06:23:05 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1245277696' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Oct 10 06:23:06 np0005479823 systemd[1]: Started Hostname Service.
Oct 10 06:23:06 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Oct 10 06:23:06 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/362286490' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Oct 10 06:23:06 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Oct 10 06:23:06 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3035571500' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Oct 10 06:23:06 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Oct 10 06:23:06 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/330407203' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Oct 10 06:23:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:06.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:06 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Oct 10 06:23:06 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/341481678' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Oct 10 06:23:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:07 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Oct 10 06:23:07 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2358813671' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Oct 10 06:23:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:23:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:07.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:23:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:08 np0005479823 nova_compute[235775]: 2025-10-10 10:23:08.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:23:08 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Oct 10 06:23:08 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/311730347' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Oct 10 06:23:08 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Oct 10 06:23:08 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2112620568' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Oct 10 06:23:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:23:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:08.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:23:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:09 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Oct 10 06:23:09 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1170104191' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Oct 10 06:23:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:09.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:09 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Oct 10 06:23:09 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4079194258' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Oct 10 06:23:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:09 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 06:23:09 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 06:23:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:23:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:23:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:23:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:23:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:23:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Oct 10 06:23:10 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1651511276' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Oct 10 06:23:10 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 06:23:10 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 06:23:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:10.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:10 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 06:23:10 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 06:23:10 np0005479823 nova_compute[235775]: 2025-10-10 10:23:10.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:23:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:11 np0005479823 podman[252032]: 2025-10-10 10:23:11.227704997 +0000 UTC m=+0.089609084 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Oct 10 06:23:11 np0005479823 podman[252034]: 2025-10-10 10:23:11.238563623 +0000 UTC m=+0.095556813 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 06:23:11 np0005479823 podman[252033]: 2025-10-10 10:23:11.254570364 +0000 UTC m=+0.116266614 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct 10 06:23:11 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Oct 10 06:23:11 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/366803169' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Oct 10 06:23:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:23:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:11.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:23:11 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Oct 10 06:23:11 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3216819042' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Oct 10 06:23:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:12 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0)
Oct 10 06:23:12 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/804085901' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Oct 10 06:23:12 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0)
Oct 10 06:23:12 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1570390407' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Oct 10 06:23:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:12.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:13 np0005479823 nova_compute[235775]: 2025-10-10 10:23:13.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:23:13 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0)
Oct 10 06:23:13 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/668363965' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Oct 10 06:23:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:23:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:13.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:23:13 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump"} v 0)
Oct 10 06:23:13 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3098976176' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Oct 10 06:23:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:23:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:23:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:23:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:14 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:23:14 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0)
Oct 10 06:23:14 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1056008335' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Oct 10 06:23:14 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls"} v 0)
Oct 10 06:23:14 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1135114031' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Oct 10 06:23:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:14.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:23:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:23:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:15.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:23:15 np0005479823 ovs-appctl[253164]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct 10 06:23:15 np0005479823 ovs-appctl[253173]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct 10 06:23:15 np0005479823 ovs-appctl[253179]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct 10 06:23:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd dump"} v 0)
Oct 10 06:23:15 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3141273853' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Oct 10 06:23:15 np0005479823 nova_compute[235775]: 2025-10-10 10:23:15.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:23:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:16 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd numa-status"} v 0)
Oct 10 06:23:16 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2028344608' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Oct 10 06:23:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:16.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:17 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail"} v 0)
Oct 10 06:23:17 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4187206927' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Oct 10 06:23:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:23:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:17.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:23:17 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd stat"} v 0)
Oct 10 06:23:17 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3498280741' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Oct 10 06:23:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:18 np0005479823 nova_compute[235775]: 2025-10-10 10:23:18.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:23:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:18.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:18 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Oct 10 06:23:18 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1213327011' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Oct 10 06:23:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:23:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:23:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:23:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:23:19 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0)
Oct 10 06:23:19 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/436430043' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Oct 10 06:23:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:23:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:19.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:23:19 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump", "format": "json-pretty"} v 0)
Oct 10 06:23:19 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2054736604' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Oct 10 06:23:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:23:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0)
Oct 10 06:23:20 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1781797121' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Oct 10 06:23:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0)
Oct 10 06:23:20 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3032790570' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Oct 10 06:23:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:20.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:20 np0005479823 nova_compute[235775]: 2025-10-10 10:23:20.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:23:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:21 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json-pretty"} v 0)
Oct 10 06:23:21 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3519844817' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Oct 10 06:23:21 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs dump", "format": "json-pretty"} v 0)
Oct 10 06:23:21 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2014367048' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Oct 10 06:23:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:21.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:22 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat", "format": "json-pretty"} v 0)
Oct 10 06:23:22 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/314814720' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Oct 10 06:23:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:22.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:22 np0005479823 podman[254950]: 2025-10-10 10:23:22.815515884 +0000 UTC m=+0.086386490 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Oct 10 06:23:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:23 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0)
Oct 10 06:23:23 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/579627303' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Oct 10 06:23:23 np0005479823 nova_compute[235775]: 2025-10-10 10:23:23.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:23:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:23:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:23.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:23:23 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0)
Oct 10 06:23:23 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1812088201' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Oct 10 06:23:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:23:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:23:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:23:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:23:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:24.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:24 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0)
Oct 10 06:23:24 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1944349469' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Oct 10 06:23:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd numa-status", "format": "json-pretty"} v 0)
Oct 10 06:23:25 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3153766169' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Oct 10 06:23:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:23:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:23:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:25.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:23:25 np0005479823 nova_compute[235775]: 2025-10-10 10:23:25.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:23:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0)
Oct 10 06:23:26 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3944079894' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Oct 10 06:23:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0)
Oct 10 06:23:26 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3285063725' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Oct 10 06:23:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:26.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:26 np0005479823 virtqemud[235088]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct 10 06:23:27 np0005479823 systemd[1]: Starting Time & Date Service...
Oct 10 06:23:27 np0005479823 systemd[1]: Started Time & Date Service.
Oct 10 06:23:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:27.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:27 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0)
Oct 10 06:23:27 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/742710764' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Oct 10 06:23:28 np0005479823 nova_compute[235775]: 2025-10-10 10:23:28.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:23:28 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0)
Oct 10 06:23:28 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2080136583' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Oct 10 06:23:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:28.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:23:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:23:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:23:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:23:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:29.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:23:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:30.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:30 np0005479823 nova_compute[235775]: 2025-10-10 10:23:30.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:23:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:31.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:32.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:33 np0005479823 nova_compute[235775]: 2025-10-10 10:23:33.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:23:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:33.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:33 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:23:33 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:23:33 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:23:33 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:23:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:23:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:23:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:23:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:23:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:34.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:23:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:35.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:35 np0005479823 nova_compute[235775]: 2025-10-10 10:23:35.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:23:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:36.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:23:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:37.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:23:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:38 np0005479823 nova_compute[235775]: 2025-10-10 10:23:38.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:23:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:23:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:38.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:23:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:23:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:23:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:23:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:23:39 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:23:39 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:23:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:23:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:39.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:23:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:23:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:40.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:40 np0005479823 nova_compute[235775]: 2025-10-10 10:23:40.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:23:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:41 np0005479823 podman[255937]: 2025-10-10 10:23:41.431852413 +0000 UTC m=+0.059724908 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 06:23:41 np0005479823 podman[255939]: 2025-10-10 10:23:41.455672094 +0000 UTC m=+0.077472636 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 10 06:23:41 np0005479823 podman[255938]: 2025-10-10 10:23:41.468776673 +0000 UTC m=+0.092441455 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 10 06:23:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:23:41.480 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:23:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:23:41.480 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:23:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:23:41.481 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:23:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:23:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:41.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:23:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:42.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:43 np0005479823 nova_compute[235775]: 2025-10-10 10:23:43.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:23:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:43.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:23:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:23:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:23:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:23:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:44.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:23:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:45.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:45 np0005479823 nova_compute[235775]: 2025-10-10 10:23:45.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:23:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:23:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:46.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:23:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:47.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:48 np0005479823 nova_compute[235775]: 2025-10-10 10:23:48.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:23:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:23:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:48.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:23:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:23:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:23:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:23:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:23:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:49.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:23:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:23:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:50.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:23:50 np0005479823 nova_compute[235775]: 2025-10-10 10:23:50.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:23:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:51.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:52.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:53 np0005479823 nova_compute[235775]: 2025-10-10 10:23:53.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:23:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:53.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:53 np0005479823 podman[256016]: 2025-10-10 10:23:53.782921834 +0000 UTC m=+0.056388151 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible)
Oct 10 06:23:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:23:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:23:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:23:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:23:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:23:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:54.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:23:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:23:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:55.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:55 np0005479823 nova_compute[235775]: 2025-10-10 10:23:55.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:23:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:56.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:57 np0005479823 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 10 06:23:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:57.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:57 np0005479823 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 10 06:23:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:58 np0005479823 nova_compute[235775]: 2025-10-10 10:23:58.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:23:58 np0005479823 nova_compute[235775]: 2025-10-10 10:23:58.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:23:58 np0005479823 nova_compute[235775]: 2025-10-10 10:23:58.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:23:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:23:58.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:23:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:23:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:23:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:23:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:23:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:23:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:23:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:23:59.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:23:59 np0005479823 nova_compute[235775]: 2025-10-10 10:23:59.809 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:23:59 np0005479823 nova_compute[235775]: 2025-10-10 10:23:59.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:23:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:23:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:23:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:23:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:24:00 np0005479823 nova_compute[235775]: 2025-10-10 10:24:00.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:24:00 np0005479823 nova_compute[235775]: 2025-10-10 10:24:00.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 10 06:24:00 np0005479823 nova_compute[235775]: 2025-10-10 10:24:00.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 10 06:24:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:24:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:00.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:24:00 np0005479823 nova_compute[235775]: 2025-10-10 10:24:00.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:24:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:00 np0005479823 nova_compute[235775]: 2025-10-10 10:24:00.966 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct 10 06:24:00 np0005479823 nova_compute[235775]: 2025-10-10 10:24:00.967 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:24:00 np0005479823 nova_compute[235775]: 2025-10-10 10:24:00.990 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:24:00 np0005479823 nova_compute[235775]: 2025-10-10 10:24:00.991 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:24:00 np0005479823 nova_compute[235775]: 2025-10-10 10:24:00.991 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:24:00 np0005479823 nova_compute[235775]: 2025-10-10 10:24:00.991 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 10 06:24:00 np0005479823 nova_compute[235775]: 2025-10-10 10:24:00.992 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:24:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:24:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:01.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:24:01 np0005479823 nova_compute[235775]: 2025-10-10 10:24:01.509 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:24:01 np0005479823 nova_compute[235775]: 2025-10-10 10:24:01.630 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:24:01 np0005479823 nova_compute[235775]: 2025-10-10 10:24:01.631 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4720MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 10 06:24:01 np0005479823 nova_compute[235775]: 2025-10-10 10:24:01.631 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:24:01 np0005479823 nova_compute[235775]: 2025-10-10 10:24:01.631 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:24:01 np0005479823 nova_compute[235775]: 2025-10-10 10:24:01.699 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 10 06:24:01 np0005479823 nova_compute[235775]: 2025-10-10 10:24:01.700 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 10 06:24:01 np0005479823 nova_compute[235775]: 2025-10-10 10:24:01.718 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:24:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:02 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:24:02 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3076253319' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:24:02 np0005479823 nova_compute[235775]: 2025-10-10 10:24:02.174 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:24:02 np0005479823 nova_compute[235775]: 2025-10-10 10:24:02.179 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 10 06:24:02 np0005479823 nova_compute[235775]: 2025-10-10 10:24:02.196 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 10 06:24:02 np0005479823 nova_compute[235775]: 2025-10-10 10:24:02.198 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct 10 06:24:02 np0005479823 nova_compute[235775]: 2025-10-10 10:24:02.198 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.567s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:24:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:02.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:03 np0005479823 nova_compute[235775]: 2025-10-10 10:24:03.046 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:24:03 np0005479823 nova_compute[235775]: 2025-10-10 10:24:03.047 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:24:03 np0005479823 nova_compute[235775]: 2025-10-10 10:24:03.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:24:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:24:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:03.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:24:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:24:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:24:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:24:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:24:04 np0005479823 nova_compute[235775]: 2025-10-10 10:24:04.810 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:24:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:04.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:04 np0005479823 nova_compute[235775]: 2025-10-10 10:24:04.835 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:24:04 np0005479823 nova_compute[235775]: 2025-10-10 10:24:04.836 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct 10 06:24:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:24:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:24:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:05.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:24:05 np0005479823 nova_compute[235775]: 2025-10-10 10:24:05.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:24:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:06.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:24:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:07.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:24:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:08 np0005479823 nova_compute[235775]: 2025-10-10 10:24:08.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:24:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:24:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:08.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:24:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:24:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:24:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:24:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:24:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:09.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:24:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:10.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:10 np0005479823 nova_compute[235775]: 2025-10-10 10:24:10.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:24:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:24:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:11.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:24:11 np0005479823 podman[256127]: 2025-10-10 10:24:11.712550595 +0000 UTC m=+0.058508320 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct 10 06:24:11 np0005479823 podman[256125]: 2025-10-10 10:24:11.717477932 +0000 UTC m=+0.067366513 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 06:24:11 np0005479823 podman[256126]: 2025-10-10 10:24:11.739598399 +0000 UTC m=+0.089486920 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible)
Oct 10 06:24:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:12.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:13 np0005479823 nova_compute[235775]: 2025-10-10 10:24:13.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:24:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:24:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:13.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:24:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:24:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:24:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:24:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:14 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:24:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:24:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:14.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:24:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:24:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:15.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:15 np0005479823 nova_compute[235775]: 2025-10-10 10:24:15.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:24:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:16.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:17.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:18 np0005479823 nova_compute[235775]: 2025-10-10 10:24:18.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:24:18 np0005479823 systemd-logind[796]: Session 57 logged out. Waiting for processes to exit.
Oct 10 06:24:18 np0005479823 systemd[1]: session-57.scope: Deactivated successfully.
Oct 10 06:24:18 np0005479823 systemd[1]: session-57.scope: Consumed 2min 45.594s CPU time, 723.5M memory peak, read 279.5M from disk, written 89.3M to disk.
Oct 10 06:24:18 np0005479823 systemd-logind[796]: Removed session 57.
Oct 10 06:24:18 np0005479823 systemd-logind[796]: New session 58 of user zuul.
Oct 10 06:24:18 np0005479823 systemd[1]: Started Session 58 of User zuul.
Oct 10 06:24:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:24:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:18.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:24:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:18 np0005479823 systemd[1]: session-58.scope: Deactivated successfully.
Oct 10 06:24:18 np0005479823 systemd-logind[796]: Session 58 logged out. Waiting for processes to exit.
Oct 10 06:24:18 np0005479823 systemd-logind[796]: Removed session 58.
Oct 10 06:24:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:24:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:24:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:24:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:24:19 np0005479823 systemd-logind[796]: New session 59 of user zuul.
Oct 10 06:24:19 np0005479823 systemd[1]: Started Session 59 of User zuul.
Oct 10 06:24:19 np0005479823 systemd[1]: session-59.scope: Deactivated successfully.
Oct 10 06:24:19 np0005479823 systemd-logind[796]: Session 59 logged out. Waiting for processes to exit.
Oct 10 06:24:19 np0005479823 systemd-logind[796]: Removed session 59.
Oct 10 06:24:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:19.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:24:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:20.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:20 np0005479823 nova_compute[235775]: 2025-10-10 10:24:20.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:24:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:21.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:22.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:23 np0005479823 nova_compute[235775]: 2025-10-10 10:24:23.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:24:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:24:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:23.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:24:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:24:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:24:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:24:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:24:24 np0005479823 podman[256287]: 2025-10-10 10:24:24.792681588 +0000 UTC m=+0.061637730 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3)
Oct 10 06:24:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:24.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:24:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:25.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:25 np0005479823 nova_compute[235775]: 2025-10-10 10:24:25.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:24:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 10 06:24:26 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2262894800' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 06:24:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 10 06:24:26 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2262894800' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 06:24:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:24:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:26.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:24:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:24:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:27.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:24:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:28 np0005479823 nova_compute[235775]: 2025-10-10 10:24:28.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:24:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:24:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:28.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:24:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:24:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:24:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:24:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:24:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:29.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:24:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:30.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:30 np0005479823 nova_compute[235775]: 2025-10-10 10:24:30.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:24:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:31.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:32.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:33 np0005479823 nova_compute[235775]: 2025-10-10 10:24:33.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:24:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:24:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:33.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:24:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:24:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:24:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:24:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:24:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:24:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:34.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:24:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:24:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:35.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:35 np0005479823 nova_compute[235775]: 2025-10-10 10:24:35.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:24:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:36.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:37.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:38 np0005479823 nova_compute[235775]: 2025-10-10 10:24:38.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:24:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:38.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:24:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:24:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:24:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:24:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:39.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:40 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:24:40 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:24:40 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:24:40 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:24:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:24:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:24:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:40.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:24:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:40 np0005479823 nova_compute[235775]: 2025-10-10 10:24:40.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:24:41 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:24:41 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:24:41 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:24:41 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:24:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:24:41.481 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:24:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:24:41.481 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:24:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:24:41.482 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:24:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:41.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:42 np0005479823 podman[256432]: 2025-10-10 10:24:42.774459894 +0000 UTC m=+0.046459765 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 10 06:24:42 np0005479823 podman[256430]: 2025-10-10 10:24:42.776608073 +0000 UTC m=+0.054497693 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 10 06:24:42 np0005479823 podman[256431]: 2025-10-10 10:24:42.802852922 +0000 UTC m=+0.078415166 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 06:24:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:42.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:43 np0005479823 nova_compute[235775]: 2025-10-10 10:24:43.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:24:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:43.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:24:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:24:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:24:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:24:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:44.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:45 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:24:45 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:24:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:24:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:45.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:45 np0005479823 nova_compute[235775]: 2025-10-10 10:24:45.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:24:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:46.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:24:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:47.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:24:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:48 np0005479823 nova_compute[235775]: 2025-10-10 10:24:48.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:24:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:48.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:24:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:24:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:24:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:24:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:49.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:24:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:50.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:50 np0005479823 nova_compute[235775]: 2025-10-10 10:24:50.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:24:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:51.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:52.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:53 np0005479823 nova_compute[235775]: 2025-10-10 10:24:53.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:24:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:53.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:24:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:24:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:24:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.201473) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091894201545, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 2430, "num_deletes": 508, "total_data_size": 5033625, "memory_usage": 5106832, "flush_reason": "Manual Compaction"}
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091894221085, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 3255631, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 34488, "largest_seqno": 36913, "table_properties": {"data_size": 3245062, "index_size": 5975, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3525, "raw_key_size": 29579, "raw_average_key_size": 21, "raw_value_size": 3220417, "raw_average_value_size": 2305, "num_data_blocks": 256, "num_entries": 1397, "num_filter_entries": 1397, "num_deletions": 508, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760091750, "oldest_key_time": 1760091750, "file_creation_time": 1760091894, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 19642 microseconds, and 6414 cpu microseconds.
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.221128) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 3255631 bytes OK
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.221150) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.222872) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.222886) EVENT_LOG_v1 {"time_micros": 1760091894222882, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.222904) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 5020871, prev total WAL file size 5020871, number of live WAL files 2.
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.224114) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(3179KB)], [66(13MB)]
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091894224165, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 17070389, "oldest_snapshot_seqno": -1}
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 6497 keys, 14858799 bytes, temperature: kUnknown
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091894304586, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 14858799, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14814842, "index_size": 26631, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16261, "raw_key_size": 170803, "raw_average_key_size": 26, "raw_value_size": 14697136, "raw_average_value_size": 2262, "num_data_blocks": 1053, "num_entries": 6497, "num_filter_entries": 6497, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760091894, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.304900) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 14858799 bytes
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.306299) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 212.0 rd, 184.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.1, 13.2 +0.0 blob) out(14.2 +0.0 blob), read-write-amplify(9.8) write-amplify(4.6) OK, records in: 7530, records dropped: 1033 output_compression: NoCompression
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.306325) EVENT_LOG_v1 {"time_micros": 1760091894306313, "job": 40, "event": "compaction_finished", "compaction_time_micros": 80507, "compaction_time_cpu_micros": 26858, "output_level": 6, "num_output_files": 1, "total_output_size": 14858799, "num_input_records": 7530, "num_output_records": 6497, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091894307348, "job": 40, "event": "table_file_deletion", "file_number": 68}
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091894310714, "job": 40, "event": "table_file_deletion", "file_number": 66}
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.224006) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.310753) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.310759) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.310761) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.310763) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:24:54 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:24:54.310765) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:24:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:54.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:24:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:55.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:55 np0005479823 podman[256528]: 2025-10-10 10:24:55.785787318 +0000 UTC m=+0.058092496 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Oct 10 06:24:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:55 np0005479823 nova_compute[235775]: 2025-10-10 10:24:55.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:24:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:56.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:57.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:58 np0005479823 nova_compute[235775]: 2025-10-10 10:24:58.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:24:58 np0005479823 nova_compute[235775]: 2025-10-10 10:24:58.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:24:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:24:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:24:58.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:24:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:24:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:24:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:24:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:24:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:24:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:24:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:24:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:24:59.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:24:59 np0005479823 nova_compute[235775]: 2025-10-10 10:24:59.817 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:24:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:24:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:24:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:24:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:25:00 np0005479823 nova_compute[235775]: 2025-10-10 10:25:00.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:25:00 np0005479823 nova_compute[235775]: 2025-10-10 10:25:00.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 10 06:25:00 np0005479823 nova_compute[235775]: 2025-10-10 10:25:00.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 10 06:25:00 np0005479823 nova_compute[235775]: 2025-10-10 10:25:00.834 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct 10 06:25:00 np0005479823 nova_compute[235775]: 2025-10-10 10:25:00.834 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:25:00 np0005479823 nova_compute[235775]: 2025-10-10 10:25:00.834 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:25:00 np0005479823 nova_compute[235775]: 2025-10-10 10:25:00.859 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:25:00 np0005479823 nova_compute[235775]: 2025-10-10 10:25:00.859 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:25:00 np0005479823 nova_compute[235775]: 2025-10-10 10:25:00.860 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:25:00 np0005479823 nova_compute[235775]: 2025-10-10 10:25:00.860 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 10 06:25:00 np0005479823 nova_compute[235775]: 2025-10-10 10:25:00.860 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:25:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:00.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:00 np0005479823 nova_compute[235775]: 2025-10-10 10:25:00.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:25:01 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:25:01 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3269636147' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:25:01 np0005479823 nova_compute[235775]: 2025-10-10 10:25:01.304 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:25:01 np0005479823 nova_compute[235775]: 2025-10-10 10:25:01.489 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:25:01 np0005479823 nova_compute[235775]: 2025-10-10 10:25:01.490 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4826MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 10 06:25:01 np0005479823 nova_compute[235775]: 2025-10-10 10:25:01.491 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:25:01 np0005479823 nova_compute[235775]: 2025-10-10 10:25:01.491 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:25:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:25:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:01.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:25:01 np0005479823 nova_compute[235775]: 2025-10-10 10:25:01.573 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 10 06:25:01 np0005479823 nova_compute[235775]: 2025-10-10 10:25:01.573 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 10 06:25:01 np0005479823 nova_compute[235775]: 2025-10-10 10:25:01.590 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:25:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:02 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:25:02 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1111097345' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:25:02 np0005479823 nova_compute[235775]: 2025-10-10 10:25:02.036 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:25:02 np0005479823 nova_compute[235775]: 2025-10-10 10:25:02.041 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 10 06:25:02 np0005479823 nova_compute[235775]: 2025-10-10 10:25:02.062 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 10 06:25:02 np0005479823 nova_compute[235775]: 2025-10-10 10:25:02.064 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct 10 06:25:02 np0005479823 nova_compute[235775]: 2025-10-10 10:25:02.065 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:25:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:02.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:03 np0005479823 nova_compute[235775]: 2025-10-10 10:25:03.060 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:25:03 np0005479823 nova_compute[235775]: 2025-10-10 10:25:03.060 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:25:03 np0005479823 nova_compute[235775]: 2025-10-10 10:25:03.060 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:25:03 np0005479823 nova_compute[235775]: 2025-10-10 10:25:03.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:25:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:25:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:03.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:25:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:25:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:25:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:25:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:25:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:04.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:25:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:25:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:05.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:25:05 np0005479823 nova_compute[235775]: 2025-10-10 10:25:05.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:25:05 np0005479823 nova_compute[235775]: 2025-10-10 10:25:05.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct 10 06:25:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:05 np0005479823 nova_compute[235775]: 2025-10-10 10:25:05.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:25:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:06.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:07.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:08 np0005479823 nova_compute[235775]: 2025-10-10 10:25:08.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:25:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:08.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:25:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:25:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:25:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:25:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:09.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:25:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:10.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:10 np0005479823 nova_compute[235775]: 2025-10-10 10:25:10.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:25:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:11.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:12.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:13 np0005479823 nova_compute[235775]: 2025-10-10 10:25:13.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:25:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:25:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:13.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:25:13 np0005479823 podman[256637]: 2025-10-10 10:25:13.806586663 +0000 UTC m=+0.067245670 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 06:25:13 np0005479823 podman[256635]: 2025-10-10 10:25:13.816914352 +0000 UTC m=+0.093406844 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Oct 10 06:25:13 np0005479823 podman[256636]: 2025-10-10 10:25:13.824886318 +0000 UTC m=+0.098867160 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 10 06:25:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:25:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:25:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:25:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:14 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:25:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:14.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:25:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:25:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:15.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:25:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:15 np0005479823 nova_compute[235775]: 2025-10-10 10:25:15.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:25:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:16.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:25:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:17.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:25:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:18 np0005479823 nova_compute[235775]: 2025-10-10 10:25:18.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:25:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:18.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:25:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:25:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:25:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:25:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:25:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:19.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:25:19 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 06:25:19 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.0 total, 600.0 interval#012Cumulative writes: 7071 writes, 37K keys, 7071 commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.04 MB/s#012Cumulative WAL: 7071 writes, 7071 syncs, 1.00 writes per sync, written: 0.09 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1582 writes, 8385 keys, 1582 commit groups, 1.0 writes per commit group, ingest: 17.92 MB, 0.03 MB/s#012Interval WAL: 1582 writes, 1582 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0    136.7      0.40              0.16        20    0.020       0      0       0.0       0.0#012  L6      1/0   14.17 MB   0.0      0.3     0.1      0.2       0.2      0.0       0.0   4.5    182.1    156.3      1.57              0.64        19    0.082    107K    10K       0.0       0.0#012 Sum      1/0   14.17 MB   0.0      0.3     0.1      0.2       0.3      0.1       0.0   5.5    145.1    152.4      1.96              0.80        39    0.050    107K    10K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.0    169.4    170.5      0.47              0.22        10    0.047     34K   3591       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.3     0.1      0.2       0.2      0.0       0.0   0.0    182.1    156.3      1.57              0.64        19    0.082    107K    10K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0    137.3      0.40              0.16        19    0.021       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 2400.0 total, 600.0 interval#012Flush(GB): cumulative 0.053, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.29 GB write, 0.12 MB/s write, 0.28 GB read, 0.12 MB/s read, 2.0 seconds#012Interval compaction: 0.08 GB write, 0.13 MB/s write, 0.08 GB read, 0.13 MB/s read, 0.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x56161a963350#2 capacity: 304.00 MB usage: 26.83 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.000232 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1610,26.00 MB,8.55389%) FilterBlock(39,311.17 KB,0.0999601%) IndexBlock(39,534.27 KB,0.171626%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Oct 10 06:25:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:25:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:25:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:20.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:25:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:20 np0005479823 nova_compute[235775]: 2025-10-10 10:25:20.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:25:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:25:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:21.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:25:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:25:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:22.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:25:23 np0005479823 nova_compute[235775]: 2025-10-10 10:25:23.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:25:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:23.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:25:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:25:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:25:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:25:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:24.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:25:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:25.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:25 np0005479823 nova_compute[235775]: 2025-10-10 10:25:25.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:25:26 np0005479823 podman[256736]: 2025-10-10 10:25:26.78075978 +0000 UTC m=+0.053796380 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true)
Oct 10 06:25:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:26.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:27.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:28 np0005479823 nova_compute[235775]: 2025-10-10 10:25:28.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:25:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:28.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:25:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:25:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:25:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:25:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:29.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:25:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:25:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:30.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:25:30 np0005479823 nova_compute[235775]: 2025-10-10 10:25:30.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:25:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:31.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:25:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:32.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:25:33 np0005479823 nova_compute[235775]: 2025-10-10 10:25:33.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:25:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:33.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:25:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:25:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:25:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:25:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:25:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:34.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:25:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-crash-compute-2[75519]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Oct 10 06:25:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:25:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:25:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:35.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:25:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:36 np0005479823 nova_compute[235775]: 2025-10-10 10:25:36.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:25:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:36.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:25:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:37.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:25:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:38 np0005479823 nova_compute[235775]: 2025-10-10 10:25:38.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:25:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:38.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:25:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:25:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:25:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:25:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:39.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:25:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:40.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:41 np0005479823 nova_compute[235775]: 2025-10-10 10:25:41.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:25:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:25:41.482 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct 10 06:25:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:25:41.483 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct 10 06:25:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:25:41.483 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 06:25:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:25:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:41.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:25:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:42.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:43 np0005479823 nova_compute[235775]: 2025-10-10 10:25:43.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:25:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:43.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:25:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:25:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:25:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:25:44 np0005479823 podman[256802]: 2025-10-10 10:25:44.801635056 +0000 UTC m=+0.065888967 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, tcib_managed=true, managed_by=edpm_ansible)
Oct 10 06:25:44 np0005479823 podman[256800]: 2025-10-10 10:25:44.809535028 +0000 UTC m=+0.074770309 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct 10 06:25:44 np0005479823 podman[256801]: 2025-10-10 10:25:44.827985737 +0000 UTC m=+0.096096310 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct 10 06:25:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:44.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:45 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:25:45 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:25:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:25:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:45.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:46 np0005479823 nova_compute[235775]: 2025-10-10 10:25:46.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:25:46 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:25:46 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:25:46 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:25:46 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:25:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:46.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:47.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:48 np0005479823 nova_compute[235775]: 2025-10-10 10:25:48.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:25:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:48.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:25:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:25:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:25:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:25:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:49.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:49 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 06:25:49 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.0 total, 600.0 interval#012Cumulative writes: 10K writes, 42K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s#012Cumulative WAL: 10K writes, 3043 syncs, 3.52 writes per sync, written: 0.04 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2448 writes, 8982 keys, 2448 commit groups, 1.0 writes per commit group, ingest: 9.37 MB, 0.02 MB/s#012Interval WAL: 2448 writes, 1024 syncs, 2.39 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Oct 10 06:25:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:25:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:50.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:51 np0005479823 nova_compute[235775]: 2025-10-10 10:25:51.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:25:51 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:25:51 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:25:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:51.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:52.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:53 np0005479823 nova_compute[235775]: 2025-10-10 10:25:53.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:25:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:53.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:25:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:25:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:25:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:25:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:54.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:25:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:55.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:56 np0005479823 nova_compute[235775]: 2025-10-10 10:25:56.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:25:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:56.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:57.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:57 np0005479823 podman[257056]: 2025-10-10 10:25:57.778727486 +0000 UTC m=+0.058829060 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct 10 06:25:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:58 np0005479823 nova_compute[235775]: 2025-10-10 10:25:58.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:25:58 np0005479823 nova_compute[235775]: 2025-10-10 10:25:58.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:25:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:25:58.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:25:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:25:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:25:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:25:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:25:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:25:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:25:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:25:59.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:25:59 np0005479823 nova_compute[235775]: 2025-10-10 10:25:59.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:25:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:25:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:25:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:25:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:26:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:00 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:00 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:26:00 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:00.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:26:01 np0005479823 nova_compute[235775]: 2025-10-10 10:26:01.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:26:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:26:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:01.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:26:01 np0005479823 nova_compute[235775]: 2025-10-10 10:26:01.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:26:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:02 np0005479823 nova_compute[235775]: 2025-10-10 10:26:02.809 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:26:02 np0005479823 nova_compute[235775]: 2025-10-10 10:26:02.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:26:02 np0005479823 nova_compute[235775]: 2025-10-10 10:26:02.813 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 10 06:26:02 np0005479823 nova_compute[235775]: 2025-10-10 10:26:02.813 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 10 06:26:02 np0005479823 nova_compute[235775]: 2025-10-10 10:26:02.837 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct 10 06:26:02 np0005479823 nova_compute[235775]: 2025-10-10 10:26:02.837 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:26:02 np0005479823 nova_compute[235775]: 2025-10-10 10:26:02.838 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:26:02 np0005479823 nova_compute[235775]: 2025-10-10 10:26:02.866 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:26:02 np0005479823 nova_compute[235775]: 2025-10-10 10:26:02.866 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:26:02 np0005479823 nova_compute[235775]: 2025-10-10 10:26:02.866 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:26:02 np0005479823 nova_compute[235775]: 2025-10-10 10:26:02.866 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 10 06:26:02 np0005479823 nova_compute[235775]: 2025-10-10 10:26:02.867 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:26:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:02 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:02.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:03 np0005479823 nova_compute[235775]: 2025-10-10 10:26:03.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:26:03 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:26:03 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/212724700' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:26:03 np0005479823 nova_compute[235775]: 2025-10-10 10:26:03.284 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:26:03 np0005479823 nova_compute[235775]: 2025-10-10 10:26:03.428 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:26:03 np0005479823 nova_compute[235775]: 2025-10-10 10:26:03.429 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4837MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 10 06:26:03 np0005479823 nova_compute[235775]: 2025-10-10 10:26:03.429 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:26:03 np0005479823 nova_compute[235775]: 2025-10-10 10:26:03.430 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:26:03 np0005479823 nova_compute[235775]: 2025-10-10 10:26:03.521 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 10 06:26:03 np0005479823 nova_compute[235775]: 2025-10-10 10:26:03.522 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 10 06:26:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:03.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:03 np0005479823 nova_compute[235775]: 2025-10-10 10:26:03.691 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Refreshing inventories for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Oct 10 06:26:03 np0005479823 nova_compute[235775]: 2025-10-10 10:26:03.796 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Updating ProviderTree inventory for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Oct 10 06:26:03 np0005479823 nova_compute[235775]: 2025-10-10 10:26:03.797 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Updating inventory in ProviderTree for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Oct 10 06:26:03 np0005479823 nova_compute[235775]: 2025-10-10 10:26:03.809 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Refreshing aggregate associations for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Oct 10 06:26:03 np0005479823 nova_compute[235775]: 2025-10-10 10:26:03.835 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Refreshing trait associations for resource provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0, traits: HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_AVX2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_FMA3,HW_CPU_X86_SVM,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_CLMUL,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_BMI2,HW_CPU_X86_MMX,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Oct 10 06:26:03 np0005479823 nova_compute[235775]: 2025-10-10 10:26:03.851 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct 10 06:26:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:26:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:26:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:26:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:26:04 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:26:04 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/470207897' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:26:04 np0005479823 nova_compute[235775]: 2025-10-10 10:26:04.314 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct 10 06:26:04 np0005479823 nova_compute[235775]: 2025-10-10 10:26:04.321 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct 10 06:26:04 np0005479823 nova_compute[235775]: 2025-10-10 10:26:04.345 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct 10 06:26:04 np0005479823 nova_compute[235775]: 2025-10-10 10:26:04.347 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct 10 06:26:04 np0005479823 nova_compute[235775]: 2025-10-10 10:26:04.347 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.918s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct 10 06:26:04 np0005479823 nova_compute[235775]: 2025-10-10 10:26:04.348 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 06:26:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:04.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:05 np0005479823 nova_compute[235775]: 2025-10-10 10:26:05.342 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 06:26:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:26:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:26:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:05.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:26:05 np0005479823 nova_compute[235775]: 2025-10-10 10:26:05.810 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 06:26:05 np0005479823 nova_compute[235775]: 2025-10-10 10:26:05.829 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 06:26:05 np0005479823 nova_compute[235775]: 2025-10-10 10:26:05.830 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct 10 06:26:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:06 np0005479823 nova_compute[235775]: 2025-10-10 10:26:06.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:26:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:26:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:06.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:26:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:07.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:08 np0005479823 nova_compute[235775]: 2025-10-10 10:26:08.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:26:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:08.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:26:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:26:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:26:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:26:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:26:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:09.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:26:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:26:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:10.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:11 np0005479823 nova_compute[235775]: 2025-10-10 10:26:11.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:26:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:26:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:11.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:26:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:26:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:12.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:26:13 np0005479823 nova_compute[235775]: 2025-10-10 10:26:13.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:26:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:13.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:13 np0005479823 nova_compute[235775]: 2025-10-10 10:26:13.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 06:26:13 np0005479823 nova_compute[235775]: 2025-10-10 10:26:13.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct 10 06:26:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:26:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:26:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:26:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:14 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:26:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:14.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:26:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:15.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:15 np0005479823 podman[257166]: 2025-10-10 10:26:15.789345675 +0000 UTC m=+0.058298254 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct 10 06:26:15 np0005479823 podman[257164]: 2025-10-10 10:26:15.79796048 +0000 UTC m=+0.071959510 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 10 06:26:15 np0005479823 podman[257165]: 2025-10-10 10:26:15.823510676 +0000 UTC m=+0.095378668 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct 10 06:26:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:16 np0005479823 nova_compute[235775]: 2025-10-10 10:26:16.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:26:16 np0005479823 nova_compute[235775]: 2025-10-10 10:26:16.838 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct 10 06:26:16 np0005479823 nova_compute[235775]: 2025-10-10 10:26:16.839 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct 10 06:26:16 np0005479823 nova_compute[235775]: 2025-10-10 10:26:16.878 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct 10 06:26:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:16.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:26:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:17.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:26:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:18 np0005479823 nova_compute[235775]: 2025-10-10 10:26:18.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct 10 06:26:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:26:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:18.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:26:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:26:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:26:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:26:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:26:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:26:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:19.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:26:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:26:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:26:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:20.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:26:21 np0005479823 nova_compute[235775]: 2025-10-10 10:26:21.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:26:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:26:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:21.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:26:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:22.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:23 np0005479823 nova_compute[235775]: 2025-10-10 10:26:23.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:26:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:26:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:23.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:26:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:26:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:26:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:26:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.593072) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091984593101, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 1140, "num_deletes": 251, "total_data_size": 2603785, "memory_usage": 2650792, "flush_reason": "Manual Compaction"}
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091984603375, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 1080048, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36919, "largest_seqno": 38053, "table_properties": {"data_size": 1076005, "index_size": 1631, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10781, "raw_average_key_size": 20, "raw_value_size": 1067267, "raw_average_value_size": 2068, "num_data_blocks": 70, "num_entries": 516, "num_filter_entries": 516, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760091894, "oldest_key_time": 1760091894, "file_creation_time": 1760091984, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 10332 microseconds, and 3214 cpu microseconds.
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.603405) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 1080048 bytes OK
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.603419) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.605033) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.605043) EVENT_LOG_v1 {"time_micros": 1760091984605039, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.605057) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 2598290, prev total WAL file size 2598290, number of live WAL files 2.
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.605690) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303033' seq:72057594037927935, type:22 .. '6D6772737461740031323535' seq:0, type:0; will stop at (end)
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(1054KB)], [69(14MB)]
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091984605715, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 15938847, "oldest_snapshot_seqno": -1}
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6534 keys, 12459862 bytes, temperature: kUnknown
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091984649814, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 12459862, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12419315, "index_size": 23091, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16389, "raw_key_size": 171738, "raw_average_key_size": 26, "raw_value_size": 12304663, "raw_average_value_size": 1883, "num_data_blocks": 907, "num_entries": 6534, "num_filter_entries": 6534, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760091984, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.650154) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 12459862 bytes
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.651399) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 360.4 rd, 281.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 14.2 +0.0 blob) out(11.9 +0.0 blob), read-write-amplify(26.3) write-amplify(11.5) OK, records in: 7013, records dropped: 479 output_compression: NoCompression
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.651418) EVENT_LOG_v1 {"time_micros": 1760091984651409, "job": 42, "event": "compaction_finished", "compaction_time_micros": 44222, "compaction_time_cpu_micros": 24421, "output_level": 6, "num_output_files": 1, "total_output_size": 12459862, "num_input_records": 7013, "num_output_records": 6534, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091984651711, "job": 42, "event": "table_file_deletion", "file_number": 71}
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760091984654365, "job": 42, "event": "table_file_deletion", "file_number": 69}
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.605631) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.654460) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.654467) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.654470) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.654473) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:26:24 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:26:24.654476) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:26:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:24.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:26:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:26:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:25.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:26:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:26 np0005479823 nova_compute[235775]: 2025-10-10 10:26:26.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:26:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:26.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:26:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:27.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:26:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:28 np0005479823 nova_compute[235775]: 2025-10-10 10:26:28.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:26:28 np0005479823 podman[257265]: 2025-10-10 10:26:28.781008472 +0000 UTC m=+0.054524473 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct 10 06:26:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:28.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:26:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:26:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:26:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:26:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:26:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:29.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:26:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:26:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:30.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:31 np0005479823 nova_compute[235775]: 2025-10-10 10:26:31.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:26:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:31.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:32.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:33 np0005479823 nova_compute[235775]: 2025-10-10 10:26:33.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:26:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:33.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:26:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:26:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:26:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:26:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:26:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:34.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:26:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:26:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:35.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:36 np0005479823 nova_compute[235775]: 2025-10-10 10:26:36.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:26:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:36.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:37.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:38 np0005479823 nova_compute[235775]: 2025-10-10 10:26:38.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:26:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:38 np0005479823 nova_compute[235775]: 2025-10-10 10:26:38.944 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:26:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:38.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:26:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:26:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:26:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:26:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:26:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:39.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:26:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:26:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:40.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:41 np0005479823 nova_compute[235775]: 2025-10-10 10:26:41.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:26:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:26:41.483 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:26:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:26:41.483 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:26:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:26:41.484 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:26:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:26:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:41.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:26:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:42.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:43 np0005479823 nova_compute[235775]: 2025-10-10 10:26:43.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:26:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:43.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:26:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:26:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:26:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:26:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:44.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:26:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:45.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:46 np0005479823 nova_compute[235775]: 2025-10-10 10:26:46.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:26:46 np0005479823 podman[257328]: 2025-10-10 10:26:46.778729035 +0000 UTC m=+0.058407325 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct 10 06:26:46 np0005479823 podman[257330]: 2025-10-10 10:26:46.79080589 +0000 UTC m=+0.062040521 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 10 06:26:46 np0005479823 podman[257329]: 2025-10-10 10:26:46.805123347 +0000 UTC m=+0.081751650 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct 10 06:26:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:46.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:26:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:47.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:26:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:48 np0005479823 nova_compute[235775]: 2025-10-10 10:26:48.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:26:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:26:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:26:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:26:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:26:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:49.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:49.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:26:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:51.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:51 np0005479823 nova_compute[235775]: 2025-10-10 10:26:51.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:26:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:51.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:52 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:26:52 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:26:52 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:26:52 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:26:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:53.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:53 np0005479823 nova_compute[235775]: 2025-10-10 10:26:53.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:26:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:53.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:26:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:26:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:26:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:26:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:55.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:26:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:55.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:56 np0005479823 nova_compute[235775]: 2025-10-10 10:26:56.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:26:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:57.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:57 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:26:57 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:26:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:57.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:58 np0005479823 nova_compute[235775]: 2025-10-10 10:26:58.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:26:58 np0005479823 nova_compute[235775]: 2025-10-10 10:26:58.832 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:26:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:26:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:26:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:26:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:26:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:26:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:26:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:26:59.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:26:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:26:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:26:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:26:59.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:26:59 np0005479823 podman[257513]: 2025-10-10 10:26:59.823685777 +0000 UTC m=+0.088297450 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 10 06:26:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:26:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:26:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:26:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:27:00 np0005479823 nova_compute[235775]: 2025-10-10 10:27:00.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:27:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:27:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:01.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:27:01 np0005479823 nova_compute[235775]: 2025-10-10 10:27:01.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:27:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:27:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:01.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:27:01 np0005479823 nova_compute[235775]: 2025-10-10 10:27:01.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:27:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:03.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:03 np0005479823 nova_compute[235775]: 2025-10-10 10:27:03.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:27:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:03.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:03 np0005479823 nova_compute[235775]: 2025-10-10 10:27:03.810 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:27:03 np0005479823 nova_compute[235775]: 2025-10-10 10:27:03.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:27:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:27:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:27:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:27:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:27:04 np0005479823 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Oct 10 06:27:04 np0005479823 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Oct 10 06:27:04 np0005479823 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Oct 10 06:27:04 np0005479823 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Oct 10 06:27:04 np0005479823 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Oct 10 06:27:04 np0005479823 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Oct 10 06:27:04 np0005479823 radosgw[83867]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Oct 10 06:27:04 np0005479823 nova_compute[235775]: 2025-10-10 10:27:04.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:27:04 np0005479823 nova_compute[235775]: 2025-10-10 10:27:04.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 10 06:27:04 np0005479823 nova_compute[235775]: 2025-10-10 10:27:04.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 10 06:27:04 np0005479823 nova_compute[235775]: 2025-10-10 10:27:04.838 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct 10 06:27:04 np0005479823 nova_compute[235775]: 2025-10-10 10:27:04.839 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:27:04 np0005479823 nova_compute[235775]: 2025-10-10 10:27:04.839 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:27:04 np0005479823 nova_compute[235775]: 2025-10-10 10:27:04.878 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:27:04 np0005479823 nova_compute[235775]: 2025-10-10 10:27:04.879 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:27:04 np0005479823 nova_compute[235775]: 2025-10-10 10:27:04.879 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:27:04 np0005479823 nova_compute[235775]: 2025-10-10 10:27:04.880 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 10 06:27:04 np0005479823 nova_compute[235775]: 2025-10-10 10:27:04.880 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:27:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:05.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:27:05 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/821806748' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:27:05 np0005479823 nova_compute[235775]: 2025-10-10 10:27:05.332 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:27:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:27:05 np0005479823 nova_compute[235775]: 2025-10-10 10:27:05.481 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:27:05 np0005479823 nova_compute[235775]: 2025-10-10 10:27:05.482 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4844MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 10 06:27:05 np0005479823 nova_compute[235775]: 2025-10-10 10:27:05.482 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:27:05 np0005479823 nova_compute[235775]: 2025-10-10 10:27:05.482 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:27:05 np0005479823 nova_compute[235775]: 2025-10-10 10:27:05.551 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 10 06:27:05 np0005479823 nova_compute[235775]: 2025-10-10 10:27:05.551 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 10 06:27:05 np0005479823 nova_compute[235775]: 2025-10-10 10:27:05.586 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:27:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:05.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:06 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:27:06 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2564603434' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:27:06 np0005479823 nova_compute[235775]: 2025-10-10 10:27:06.085 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:27:06 np0005479823 nova_compute[235775]: 2025-10-10 10:27:06.091 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 10 06:27:06 np0005479823 nova_compute[235775]: 2025-10-10 10:27:06.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:27:06 np0005479823 nova_compute[235775]: 2025-10-10 10:27:06.115 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 10 06:27:06 np0005479823 nova_compute[235775]: 2025-10-10 10:27:06.117 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct 10 06:27:06 np0005479823 nova_compute[235775]: 2025-10-10 10:27:06.117 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.635s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:27:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:27:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:07.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:27:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:07.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:08 np0005479823 nova_compute[235775]: 2025-10-10 10:27:08.093 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:27:08 np0005479823 nova_compute[235775]: 2025-10-10 10:27:08.094 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct 10 06:27:08 np0005479823 nova_compute[235775]: 2025-10-10 10:27:08.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:27:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:27:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:27:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:27:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:27:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:09.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:09 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:27:09 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3915087126' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:27:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:27:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:09.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:27:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:27:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:11.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:11 np0005479823 nova_compute[235775]: 2025-10-10 10:27:11.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:27:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:27:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:11.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:27:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:13.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:13 np0005479823 nova_compute[235775]: 2025-10-10 10:27:13.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:27:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:13.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:27:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:27:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:27:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:14 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:27:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:15.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:27:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:15.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:16 np0005479823 nova_compute[235775]: 2025-10-10 10:27:16.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:27:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:17.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:17.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:17 np0005479823 podman[257620]: 2025-10-10 10:27:17.815855029 +0000 UTC m=+0.082064711 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3)
Oct 10 06:27:17 np0005479823 podman[257624]: 2025-10-10 10:27:17.848070588 +0000 UTC m=+0.093875879 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 10 06:27:17 np0005479823 podman[257621]: 2025-10-10 10:27:17.850983911 +0000 UTC m=+0.112478773 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct 10 06:27:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:18 np0005479823 nova_compute[235775]: 2025-10-10 10:27:18.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:27:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:27:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:27:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:27:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:27:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:27:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:19.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:27:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:19.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:27:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:21.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:21 np0005479823 nova_compute[235775]: 2025-10-10 10:27:21.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:27:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:27:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:21.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:27:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:27:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:23.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:27:23 np0005479823 nova_compute[235775]: 2025-10-10 10:27:23.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:27:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:23.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:27:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:27:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:27:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:27:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:25.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:27:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:25.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:26 np0005479823 nova_compute[235775]: 2025-10-10 10:27:26.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:27:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 10 06:27:26 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4007566910' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 06:27:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 10 06:27:26 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4007566910' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 06:27:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:27:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:27.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:27:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:27.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:28 np0005479823 nova_compute[235775]: 2025-10-10 10:27:28.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:27:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:27:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:27:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:27:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:27:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:29.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:29.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:27:30 np0005479823 nova_compute[235775]: 2025-10-10 10:27:30.466 2 DEBUG oslo_concurrency.processutils [None req-5428eec2-0e0c-4df7-adf7-b6b22d8050c9 e1aed125091e48e09d5990f110c14c39 ec962e275689437d80680ff3ea69c852 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:27:30 np0005479823 nova_compute[235775]: 2025-10-10 10:27:30.502 2 DEBUG oslo_concurrency.processutils [None req-5428eec2-0e0c-4df7-adf7-b6b22d8050c9 e1aed125091e48e09d5990f110c14c39 ec962e275689437d80680ff3ea69c852 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:27:30 np0005479823 podman[257721]: 2025-10-10 10:27:30.775619942 +0000 UTC m=+0.053569842 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct 10 06:27:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:27:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:31.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:27:31 np0005479823 nova_compute[235775]: 2025-10-10 10:27:31.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:27:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:31.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:27:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:33.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:27:33 np0005479823 nova_compute[235775]: 2025-10-10 10:27:33.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:27:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:33.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:27:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:27:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:27:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:27:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:27:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:35.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:27:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:27:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:35.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:36 np0005479823 nova_compute[235775]: 2025-10-10 10:27:36.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:27:36 np0005479823 nova_compute[235775]: 2025-10-10 10:27:36.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:27:36 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:27:36.324 141795 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'da:dc:6a', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '42:2f:dd:4e:d8:41'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct 10 06:27:36 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:27:36.327 141795 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct 10 06:27:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:37.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:37.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:38 np0005479823 nova_compute[235775]: 2025-10-10 10:27:38.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:27:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:27:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:27:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:27:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:27:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:27:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:39.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:27:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:27:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:39.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:27:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:27:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:27:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:41.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:27:41 np0005479823 nova_compute[235775]: 2025-10-10 10:27:41.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:27:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:27:41.329 141795 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=49146ebb-575d-4bd4-816c-0b242fb944ee, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct 10 06:27:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:27:41.484 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:27:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:27:41.485 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:27:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:27:41.485 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:27:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:41.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:27:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:43.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:27:43 np0005479823 nova_compute[235775]: 2025-10-10 10:27:43.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:27:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:43.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:27:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:27:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:27:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:27:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:45.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:27:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:45.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:46 np0005479823 nova_compute[235775]: 2025-10-10 10:27:46.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:27:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:27:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:47.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:27:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:47.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:48 np0005479823 nova_compute[235775]: 2025-10-10 10:27:48.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:27:48 np0005479823 podman[257785]: 2025-10-10 10:27:48.82899107 +0000 UTC m=+0.095317765 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct 10 06:27:48 np0005479823 podman[257786]: 2025-10-10 10:27:48.854728141 +0000 UTC m=+0.116329385 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct 10 06:27:48 np0005479823 podman[257787]: 2025-10-10 10:27:48.861007221 +0000 UTC m=+0.116094027 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 10 06:27:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:27:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:27:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:27:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:27:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:27:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:49.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:27:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:49.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:27:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:27:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:51.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:27:51 np0005479823 nova_compute[235775]: 2025-10-10 10:27:51.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:27:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:27:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:51.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:27:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:53.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:53 np0005479823 nova_compute[235775]: 2025-10-10 10:27:53.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:27:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:53.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:27:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:27:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:27:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:27:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:27:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:55.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:27:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:27:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:55.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:56 np0005479823 nova_compute[235775]: 2025-10-10 10:27:56.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:27:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:27:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:57.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:27:57 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:27:57 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:27:57 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:27:57 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:27:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:57.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:58 np0005479823 nova_compute[235775]: 2025-10-10 10:27:58.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:27:58 np0005479823 nova_compute[235775]: 2025-10-10 10:27:58.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:27:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:27:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:27:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:27:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:27:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:27:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:27:59.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:27:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:27:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:27:59.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:27:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:27:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:27:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:27:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:28:00 np0005479823 nova_compute[235775]: 2025-10-10 10:28:00.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:28:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:01.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:01 np0005479823 nova_compute[235775]: 2025-10-10 10:28:01.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:28:01 np0005479823 podman[257944]: 2025-10-10 10:28:01.811800128 +0000 UTC m=+0.078622902 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct 10 06:28:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:01.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:02 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:28:02 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:28:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:28:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:03.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:28:03 np0005479823 nova_compute[235775]: 2025-10-10 10:28:03.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:28:03 np0005479823 nova_compute[235775]: 2025-10-10 10:28:03.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:28:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:03.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:28:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:28:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:28:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:28:04 np0005479823 nova_compute[235775]: 2025-10-10 10:28:04.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:28:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:05.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:28:05 np0005479823 nova_compute[235775]: 2025-10-10 10:28:05.810 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:28:05 np0005479823 nova_compute[235775]: 2025-10-10 10:28:05.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:28:05 np0005479823 nova_compute[235775]: 2025-10-10 10:28:05.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 10 06:28:05 np0005479823 nova_compute[235775]: 2025-10-10 10:28:05.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 10 06:28:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:05.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:05 np0005479823 nova_compute[235775]: 2025-10-10 10:28:05.847 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct 10 06:28:05 np0005479823 nova_compute[235775]: 2025-10-10 10:28:05.847 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:28:05 np0005479823 nova_compute[235775]: 2025-10-10 10:28:05.848 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:28:05 np0005479823 nova_compute[235775]: 2025-10-10 10:28:05.877 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:28:05 np0005479823 nova_compute[235775]: 2025-10-10 10:28:05.877 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:28:05 np0005479823 nova_compute[235775]: 2025-10-10 10:28:05.877 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:28:05 np0005479823 nova_compute[235775]: 2025-10-10 10:28:05.878 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 10 06:28:05 np0005479823 nova_compute[235775]: 2025-10-10 10:28:05.878 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:28:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:06 np0005479823 nova_compute[235775]: 2025-10-10 10:28:06.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:28:06 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:28:06 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2704639440' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:28:06 np0005479823 nova_compute[235775]: 2025-10-10 10:28:06.353 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:28:06 np0005479823 nova_compute[235775]: 2025-10-10 10:28:06.487 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:28:06 np0005479823 nova_compute[235775]: 2025-10-10 10:28:06.488 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4846MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 10 06:28:06 np0005479823 nova_compute[235775]: 2025-10-10 10:28:06.489 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:28:06 np0005479823 nova_compute[235775]: 2025-10-10 10:28:06.489 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:28:06 np0005479823 nova_compute[235775]: 2025-10-10 10:28:06.538 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 10 06:28:06 np0005479823 nova_compute[235775]: 2025-10-10 10:28:06.538 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 10 06:28:06 np0005479823 nova_compute[235775]: 2025-10-10 10:28:06.552 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:28:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:07 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:28:07 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1632890228' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:28:07 np0005479823 nova_compute[235775]: 2025-10-10 10:28:07.016 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:28:07 np0005479823 nova_compute[235775]: 2025-10-10 10:28:07.021 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 10 06:28:07 np0005479823 nova_compute[235775]: 2025-10-10 10:28:07.040 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 10 06:28:07 np0005479823 nova_compute[235775]: 2025-10-10 10:28:07.042 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct 10 06:28:07 np0005479823 nova_compute[235775]: 2025-10-10 10:28:07.042 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.553s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:28:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:07.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:28:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:07.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:28:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:08 np0005479823 nova_compute[235775]: 2025-10-10 10:28:08.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:28:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:28:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:28:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:28:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:28:09 np0005479823 nova_compute[235775]: 2025-10-10 10:28:09.009 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:28:09 np0005479823 nova_compute[235775]: 2025-10-10 10:28:09.024 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:28:09 np0005479823 nova_compute[235775]: 2025-10-10 10:28:09.025 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct 10 06:28:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:28:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:09.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:28:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:09.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:28:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:28:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:11.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:28:11 np0005479823 nova_compute[235775]: 2025-10-10 10:28:11.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:28:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:11.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:28:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:13.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:28:13 np0005479823 nova_compute[235775]: 2025-10-10 10:28:13.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:28:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:13.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:28:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:28:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:28:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:14 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:28:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:28:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:15.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:28:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:28:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:15.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:16 np0005479823 nova_compute[235775]: 2025-10-10 10:28:16.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:28:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:17.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:17.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:18 np0005479823 nova_compute[235775]: 2025-10-10 10:28:18.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:28:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:28:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:28:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:28:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:28:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:19.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:19 np0005479823 podman[258076]: 2025-10-10 10:28:19.783704805 +0000 UTC m=+0.062533168 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct 10 06:28:19 np0005479823 podman[258078]: 2025-10-10 10:28:19.792562987 +0000 UTC m=+0.066374380 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001)
Oct 10 06:28:19 np0005479823 podman[258077]: 2025-10-10 10:28:19.8108098 +0000 UTC m=+0.088255050 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct 10 06:28:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:28:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:19.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:28:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:28:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:21.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:21 np0005479823 nova_compute[235775]: 2025-10-10 10:28:21.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:28:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:28:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:21.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:28:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:23.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:23 np0005479823 nova_compute[235775]: 2025-10-10 10:28:23.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:28:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:28:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:23.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:28:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:28:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:28:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:28:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:28:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:25.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:28:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:25.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:26 np0005479823 nova_compute[235775]: 2025-10-10 10:28:26.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:28:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:27.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.516627) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760092107516672, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 1454, "num_deletes": 251, "total_data_size": 3529015, "memory_usage": 3577552, "flush_reason": "Manual Compaction"}
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760092107532642, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 2302543, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38058, "largest_seqno": 39507, "table_properties": {"data_size": 2296439, "index_size": 3367, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 13187, "raw_average_key_size": 20, "raw_value_size": 2284080, "raw_average_value_size": 3465, "num_data_blocks": 147, "num_entries": 659, "num_filter_entries": 659, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760091985, "oldest_key_time": 1760091985, "file_creation_time": 1760092107, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 16074 microseconds, and 9680 cpu microseconds.
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.532699) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 2302543 bytes OK
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.532723) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.534535) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.534557) EVENT_LOG_v1 {"time_micros": 1760092107534550, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.534580) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 3522258, prev total WAL file size 3522258, number of live WAL files 2.
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.536374) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(2248KB)], [72(11MB)]
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760092107536424, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 14762405, "oldest_snapshot_seqno": -1}
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6677 keys, 12616330 bytes, temperature: kUnknown
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760092107608701, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 12616330, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12574708, "index_size": 23846, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16709, "raw_key_size": 175413, "raw_average_key_size": 26, "raw_value_size": 12457292, "raw_average_value_size": 1865, "num_data_blocks": 936, "num_entries": 6677, "num_filter_entries": 6677, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760092107, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.609403) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 12616330 bytes
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.610854) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 203.8 rd, 174.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.2, 11.9 +0.0 blob) out(12.0 +0.0 blob), read-write-amplify(11.9) write-amplify(5.5) OK, records in: 7193, records dropped: 516 output_compression: NoCompression
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.610885) EVENT_LOG_v1 {"time_micros": 1760092107610871, "job": 44, "event": "compaction_finished", "compaction_time_micros": 72439, "compaction_time_cpu_micros": 49047, "output_level": 6, "num_output_files": 1, "total_output_size": 12616330, "num_input_records": 7193, "num_output_records": 6677, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760092107612163, "job": 44, "event": "table_file_deletion", "file_number": 74}
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760092107616156, "job": 44, "event": "table_file_deletion", "file_number": 72}
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.536291) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.616285) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.616291) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.616294) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.616297) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:28:27 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:28:27.616300) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:28:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:27.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:28 np0005479823 nova_compute[235775]: 2025-10-10 10:28:28.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:28:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:28:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:28:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:28:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:28:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:28:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:29.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:28:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:29.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:28:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:31.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:31 np0005479823 nova_compute[235775]: 2025-10-10 10:28:31.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:28:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:31.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:32 np0005479823 podman[258178]: 2025-10-10 10:28:32.808301617 +0000 UTC m=+0.072936890 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Oct 10 06:28:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:28:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:33.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:28:33 np0005479823 nova_compute[235775]: 2025-10-10 10:28:33.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:28:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:33.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:28:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:28:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:28:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:28:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:35.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:28:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:28:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:35.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:28:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:36 np0005479823 nova_compute[235775]: 2025-10-10 10:28:36.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:28:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:37.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:28:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:37.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:28:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:38 np0005479823 nova_compute[235775]: 2025-10-10 10:28:38.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:28:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:28:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:28:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:28:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:28:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:28:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:39.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:28:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:39.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:28:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:41.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:41 np0005479823 nova_compute[235775]: 2025-10-10 10:28:41.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:28:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:28:41.484 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:28:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:28:41.485 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:28:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:28:41.485 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:28:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:41.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:43.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:43 np0005479823 nova_compute[235775]: 2025-10-10 10:28:43.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:28:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:43.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:28:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:28:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:28:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:28:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:28:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:45.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:28:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:28:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:45.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:46 np0005479823 nova_compute[235775]: 2025-10-10 10:28:46.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:28:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:47.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:47.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:48 np0005479823 nova_compute[235775]: 2025-10-10 10:28:48.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:28:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:28:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:28:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:28:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:28:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:28:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:49.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:28:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:49.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:28:50 np0005479823 podman[258243]: 2025-10-10 10:28:50.811726861 +0000 UTC m=+0.064960575 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct 10 06:28:50 np0005479823 podman[258241]: 2025-10-10 10:28:50.811865495 +0000 UTC m=+0.074136538 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 10 06:28:50 np0005479823 podman[258242]: 2025-10-10 10:28:50.848529095 +0000 UTC m=+0.113649419 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Oct 10 06:28:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:51.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:51 np0005479823 nova_compute[235775]: 2025-10-10 10:28:51.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:28:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:51.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:28:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:53.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:28:53 np0005479823 nova_compute[235775]: 2025-10-10 10:28:53.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:28:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:53.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:28:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:28:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:28:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:28:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:55.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:28:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:55.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:56 np0005479823 nova_compute[235775]: 2025-10-10 10:28:56.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:28:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:28:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:57.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:28:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:57.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:58 np0005479823 nova_compute[235775]: 2025-10-10 10:28:58.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:28:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:28:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:28:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:58 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:28:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:28:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:28:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:28:59.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:59 np0005479823 nova_compute[235775]: 2025-10-10 10:28:59.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:28:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:28:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:28:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:28:59.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:28:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:28:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:28:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:28:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:29:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:01.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:01 np0005479823 nova_compute[235775]: 2025-10-10 10:29:01.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:29:01 np0005479823 nova_compute[235775]: 2025-10-10 10:29:01.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:29:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:01.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:03.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:03 np0005479823 nova_compute[235775]: 2025-10-10 10:29:03.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:29:03 np0005479823 podman[258425]: 2025-10-10 10:29:03.281690604 +0000 UTC m=+0.054956265 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 10 06:29:03 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:29:03 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:29:03 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:29:03 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:29:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:29:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:03.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:29:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:29:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:29:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:29:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:29:04 np0005479823 nova_compute[235775]: 2025-10-10 10:29:04.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:29:04 np0005479823 nova_compute[235775]: 2025-10-10 10:29:04.815 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:29:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:29:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:05.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:29:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:29:05 np0005479823 nova_compute[235775]: 2025-10-10 10:29:05.810 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:29:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:05.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:06 np0005479823 nova_compute[235775]: 2025-10-10 10:29:06.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:29:06 np0005479823 nova_compute[235775]: 2025-10-10 10:29:06.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:29:06 np0005479823 nova_compute[235775]: 2025-10-10 10:29:06.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 10 06:29:06 np0005479823 nova_compute[235775]: 2025-10-10 10:29:06.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 10 06:29:06 np0005479823 nova_compute[235775]: 2025-10-10 10:29:06.831 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct 10 06:29:06 np0005479823 nova_compute[235775]: 2025-10-10 10:29:06.831 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:29:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:07.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:07 np0005479823 nova_compute[235775]: 2025-10-10 10:29:07.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:29:07 np0005479823 nova_compute[235775]: 2025-10-10 10:29:07.841 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:29:07 np0005479823 nova_compute[235775]: 2025-10-10 10:29:07.841 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:29:07 np0005479823 nova_compute[235775]: 2025-10-10 10:29:07.842 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:29:07 np0005479823 nova_compute[235775]: 2025-10-10 10:29:07.842 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 10 06:29:07 np0005479823 nova_compute[235775]: 2025-10-10 10:29:07.842 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:29:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:07.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:08 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:29:08 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:29:08 np0005479823 nova_compute[235775]: 2025-10-10 10:29:08.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:29:08 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:29:08 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2349870666' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:29:08 np0005479823 nova_compute[235775]: 2025-10-10 10:29:08.296 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:29:08 np0005479823 nova_compute[235775]: 2025-10-10 10:29:08.441 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:29:08 np0005479823 nova_compute[235775]: 2025-10-10 10:29:08.443 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4842MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 10 06:29:08 np0005479823 nova_compute[235775]: 2025-10-10 10:29:08.443 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:29:08 np0005479823 nova_compute[235775]: 2025-10-10 10:29:08.443 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:29:08 np0005479823 nova_compute[235775]: 2025-10-10 10:29:08.675 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 10 06:29:08 np0005479823 nova_compute[235775]: 2025-10-10 10:29:08.676 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 10 06:29:08 np0005479823 nova_compute[235775]: 2025-10-10 10:29:08.717 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:29:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:29:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:29:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:29:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:29:09 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:29:09 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2057179231' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:29:09 np0005479823 nova_compute[235775]: 2025-10-10 10:29:09.144 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:29:09 np0005479823 nova_compute[235775]: 2025-10-10 10:29:09.151 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 10 06:29:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:29:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:09.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:29:09 np0005479823 nova_compute[235775]: 2025-10-10 10:29:09.168 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 10 06:29:09 np0005479823 nova_compute[235775]: 2025-10-10 10:29:09.170 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct 10 06:29:09 np0005479823 nova_compute[235775]: 2025-10-10 10:29:09.170 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.727s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:29:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:09.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:29:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:11.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:11 np0005479823 nova_compute[235775]: 2025-10-10 10:29:11.171 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:29:11 np0005479823 nova_compute[235775]: 2025-10-10 10:29:11.172 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct 10 06:29:11 np0005479823 nova_compute[235775]: 2025-10-10 10:29:11.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:29:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:29:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:11.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:29:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:29:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:13.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:29:13 np0005479823 nova_compute[235775]: 2025-10-10 10:29:13.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:29:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:13.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:29:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:29:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:29:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:14 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:29:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:29:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:15.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:29:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:29:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:15.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:16 np0005479823 nova_compute[235775]: 2025-10-10 10:29:16.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:29:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:17.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:17.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:18 np0005479823 nova_compute[235775]: 2025-10-10 10:29:18.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:29:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:29:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:29:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:29:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:29:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:29:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:19.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:29:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:29:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:19.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:29:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:29:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:29:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:21.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:29:21 np0005479823 nova_compute[235775]: 2025-10-10 10:29:21.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:29:21 np0005479823 podman[258536]: 2025-10-10 10:29:21.797625852 +0000 UTC m=+0.058807480 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid)
Oct 10 06:29:21 np0005479823 podman[258534]: 2025-10-10 10:29:21.809843312 +0000 UTC m=+0.075784272 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 10 06:29:21 np0005479823 podman[258535]: 2025-10-10 10:29:21.834623533 +0000 UTC m=+0.100466370 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 10 06:29:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:21.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:23.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:23 np0005479823 nova_compute[235775]: 2025-10-10 10:29:23.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:29:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:23.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:29:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:29:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:29:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:29:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:29:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:25.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:29:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:29:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:25.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:26 np0005479823 nova_compute[235775]: 2025-10-10 10:29:26.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:29:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 10 06:29:26 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1412912917' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 06:29:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 10 06:29:26 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1412912917' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 06:29:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:29:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:27.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:29:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:27.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:28 np0005479823 nova_compute[235775]: 2025-10-10 10:29:28.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:29:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:29:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:29:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:29:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:29:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:29:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:29.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:29:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:29.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:29:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:31.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:31 np0005479823 nova_compute[235775]: 2025-10-10 10:29:31.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:29:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:31.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:29:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:33.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:29:33 np0005479823 nova_compute[235775]: 2025-10-10 10:29:33.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:29:33 np0005479823 podman[258634]: 2025-10-10 10:29:33.771432493 +0000 UTC m=+0.044653657 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Oct 10 06:29:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:29:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:33.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:29:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:29:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:29:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:29:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:29:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:35.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:29:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:35.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:36 np0005479823 nova_compute[235775]: 2025-10-10 10:29:36.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:29:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:37.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:37.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:38 np0005479823 nova_compute[235775]: 2025-10-10 10:29:38.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:29:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:29:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:29:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:29:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:29:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:29:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:39.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:29:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:39.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:29:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:41.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:41 np0005479823 nova_compute[235775]: 2025-10-10 10:29:41.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:29:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:29:41.485 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:29:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:29:41.485 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:29:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:29:41.485 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:29:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:41.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:29:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:43.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:29:43 np0005479823 nova_compute[235775]: 2025-10-10 10:29:43.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:29:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:43.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:29:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:29:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:29:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:29:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.002000065s ======
Oct 10 06:29:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:45.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Oct 10 06:29:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:29:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:29:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:45.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:29:46 np0005479823 nova_compute[235775]: 2025-10-10 10:29:46.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:29:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:47.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:29:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:47.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:29:48 np0005479823 nova_compute[235775]: 2025-10-10 10:29:48.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:29:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:29:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:29:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:29:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:29:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:29:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:49.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:29:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:29:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:49.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:29:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:29:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:51.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:51 np0005479823 nova_compute[235775]: 2025-10-10 10:29:51.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:29:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:51.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:52 np0005479823 podman[258699]: 2025-10-10 10:29:52.777703196 +0000 UTC m=+0.051922279 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 10 06:29:52 np0005479823 podman[258697]: 2025-10-10 10:29:52.777789338 +0000 UTC m=+0.060299646 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 10 06:29:52 np0005479823 podman[258698]: 2025-10-10 10:29:52.802764146 +0000 UTC m=+0.082379501 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 10 06:29:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:53.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:53 np0005479823 nova_compute[235775]: 2025-10-10 10:29:53.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:29:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:53.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:29:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:29:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:29:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:29:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:29:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:55.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:29:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:29:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:55.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:56 np0005479823 nova_compute[235775]: 2025-10-10 10:29:56.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:29:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:57.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:57.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:58 np0005479823 nova_compute[235775]: 2025-10-10 10:29:58.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:29:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:58 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:58 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:29:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:29:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:29:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:29:59 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:29:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:29:59.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:29:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:29:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:59 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:29:59 2025: (VI_0) received an invalid passwd!
Oct 10 06:29:59 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:29:59 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:29:59 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:29:59.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:00 np0005479823 ceph-mon[74913]: Health detail: HEALTH_WARN 1 failed cephadm daemon(s)
Oct 10 06:30:00 np0005479823 ceph-mon[74913]: [WRN] CEPHADM_FAILED_DAEMON: 1 failed cephadm daemon(s)
Oct 10 06:30:00 np0005479823 ceph-mon[74913]:    daemon nfs.cephfs.2.0.compute-0.ruydzo on compute-0 is in error state
Oct 10 06:30:00 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:30:00 np0005479823 nova_compute[235775]: 2025-10-10 10:30:00.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:30:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:00 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:00 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:01 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:01 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:01.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:01 np0005479823 nova_compute[235775]: 2025-10-10 10:30:01.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:30:01 np0005479823 nova_compute[235775]: 2025-10-10 10:30:01.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:30:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:01 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:01 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:01 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:02 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:30:02 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:01.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:30:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:02 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:02 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:03 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:03 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:30:03 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:03.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:30:03 np0005479823 nova_compute[235775]: 2025-10-10 10:30:03.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:30:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:03 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:03 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:30:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:30:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:03 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:30:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:04 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:30:04 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:04 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:30:04 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:04.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:30:04 np0005479823 podman[258799]: 2025-10-10 10:30:04.80010275 +0000 UTC m=+0.068654113 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct 10 06:30:04 np0005479823 nova_compute[235775]: 2025-10-10 10:30:04.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:30:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:04 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:04 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:05 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:05 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:05 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:05.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:05 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:30:05 np0005479823 nova_compute[235775]: 2025-10-10 10:30:05.810 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:30:05 np0005479823 nova_compute[235775]: 2025-10-10 10:30:05.813 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:30:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:05 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:05 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:06 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:06 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:06 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:06.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:06 np0005479823 nova_compute[235775]: 2025-10-10 10:30:06.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:30:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:06 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:06 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:07 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:07 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:07 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:07.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:07 np0005479823 nova_compute[235775]: 2025-10-10 10:30:07.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:30:07 np0005479823 nova_compute[235775]: 2025-10-10 10:30:07.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct 10 06:30:07 np0005479823 nova_compute[235775]: 2025-10-10 10:30:07.815 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct 10 06:30:07 np0005479823 nova_compute[235775]: 2025-10-10 10:30:07.846 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct 10 06:30:07 np0005479823 nova_compute[235775]: 2025-10-10 10:30:07.847 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:30:07 np0005479823 nova_compute[235775]: 2025-10-10 10:30:07.874 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:30:07 np0005479823 nova_compute[235775]: 2025-10-10 10:30:07.874 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:30:07 np0005479823 nova_compute[235775]: 2025-10-10 10:30:07.875 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:30:07 np0005479823 nova_compute[235775]: 2025-10-10 10:30:07.875 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct 10 06:30:07 np0005479823 nova_compute[235775]: 2025-10-10 10:30:07.876 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:30:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:07 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:07 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:08 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:08 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:30:08 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:08.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:30:08 np0005479823 nova_compute[235775]: 2025-10-10 10:30:08.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:30:08 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:30:08 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1914201702' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:30:08 np0005479823 nova_compute[235775]: 2025-10-10 10:30:08.384 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:30:08 np0005479823 nova_compute[235775]: 2025-10-10 10:30:08.545 2 WARNING nova.virt.libvirt.driver [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct 10 06:30:08 np0005479823 nova_compute[235775]: 2025-10-10 10:30:08.547 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4828MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct 10 06:30:08 np0005479823 nova_compute[235775]: 2025-10-10 10:30:08.548 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:30:08 np0005479823 nova_compute[235775]: 2025-10-10 10:30:08.548 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:30:08 np0005479823 nova_compute[235775]: 2025-10-10 10:30:08.610 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct 10 06:30:08 np0005479823 nova_compute[235775]: 2025-10-10 10:30:08.610 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct 10 06:30:08 np0005479823 nova_compute[235775]: 2025-10-10 10:30:08.624 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct 10 06:30:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:08 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:08 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:30:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:30:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:08 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:30:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:09 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:30:09 np0005479823 podman[258989]: 2025-10-10 10:30:09.020921756 +0000 UTC m=+0.084800697 container exec bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Oct 10 06:30:09 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Oct 10 06:30:09 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/919033109' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Oct 10 06:30:09 np0005479823 nova_compute[235775]: 2025-10-10 10:30:09.134 2 DEBUG oslo_concurrency.processutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct 10 06:30:09 np0005479823 podman[258989]: 2025-10-10 10:30:09.138305514 +0000 UTC m=+0.202184425 container exec_died bb439f1f2eff735eaba9bea4e75b9fae89531e10b5ccfb501bdc85be3ece4ecd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-mon-compute-2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Oct 10 06:30:09 np0005479823 nova_compute[235775]: 2025-10-10 10:30:09.139 2 DEBUG nova.compute.provider_tree [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed in ProviderTree for provider: dcdfa54c-9f95-46da-9af1-da3e28d81cf0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct 10 06:30:09 np0005479823 nova_compute[235775]: 2025-10-10 10:30:09.158 2 DEBUG nova.scheduler.client.report [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Inventory has not changed for provider dcdfa54c-9f95-46da-9af1-da3e28d81cf0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct 10 06:30:09 np0005479823 nova_compute[235775]: 2025-10-10 10:30:09.160 2 DEBUG nova.compute.resource_tracker [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct 10 06:30:09 np0005479823 nova_compute[235775]: 2025-10-10 10:30:09.160 2 DEBUG oslo_concurrency.lockutils [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.612s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:30:09 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:09 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:09 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:09.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:09 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Oct 10 06:30:09 np0005479823 podman[259110]: 2025-10-10 10:30:09.564079519 +0000 UTC m=+0.056678500 container exec 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 06:30:09 np0005479823 podman[259110]: 2025-10-10 10:30:09.596407382 +0000 UTC m=+0.089006353 container exec_died 6e96d50912d7c803cc5cb2b1a31cb3effd738b34d55836b1f069b740e80da23f (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-node-exporter-compute-2, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct 10 06:30:09 np0005479823 podman[259201]: 2025-10-10 10:30:09.94712064 +0000 UTC m=+0.064994416 container exec eac346131ad153d129d5755e1377a2007627c03598a265a99b9e06d18355c13f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Oct 10 06:30:09 np0005479823 podman[259201]: 2025-10-10 10:30:09.961161768 +0000 UTC m=+0.079035454 container exec_died eac346131ad153d129d5755e1377a2007627c03598a265a99b9e06d18355c13f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Oct 10 06:30:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:09 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:09 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:10 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:10 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:10 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:10.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:10 np0005479823 nova_compute[235775]: 2025-10-10 10:30:10.128 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:30:10 np0005479823 nova_compute[235775]: 2025-10-10 10:30:10.145 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:30:10 np0005479823 podman[259266]: 2025-10-10 10:30:10.203753994 +0000 UTC m=+0.064513711 container exec 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 06:30:10 np0005479823 podman[259266]: 2025-10-10 10:30:10.238222295 +0000 UTC m=+0.098981982 container exec_died 5c4e8131f3212550ee6e627ca1a29ed4a019f81ddc50422f6d4dd594a6da28e0 (image=quay.io/ceph/haproxy:2.3, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-haproxy-nfs-cephfs-compute-2-eokdol)
Oct 10 06:30:10 np0005479823 podman[259334]: 2025-10-10 10:30:10.441077591 +0000 UTC m=+0.048342145 container exec 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, io.openshift.expose-services=, release=1793, version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, architecture=x86_64, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2)
Oct 10 06:30:10 np0005479823 podman[259334]: 2025-10-10 10:30:10.453126106 +0000 UTC m=+0.060390650 container exec_died 0e89ae3d5f811404e87ba2b013f18d1919f90b811696f2c3e9c3c1ad6f5fbb9c (image=quay.io/ceph/keepalived:2.2.4, name=ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, description=keepalived for Ceph, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Oct 10 06:30:10 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:30:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:10 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:10 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:11 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:11 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:11 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:11.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:11 np0005479823 nova_compute[235775]: 2025-10-10 10:30:11.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:30:11 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:30:11 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:30:11 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:30:11 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:30:11 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Oct 10 06:30:11 np0005479823 nova_compute[235775]: 2025-10-10 10:30:11.814 2 DEBUG oslo_service.periodic_task [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct 10 06:30:11 np0005479823 nova_compute[235775]: 2025-10-10 10:30:11.814 2 DEBUG nova.compute.manager [None req-dfef3a0c-9b50-45cd-af12-950076e41671 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct 10 06:30:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:11 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:11 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:12 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:12 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:12 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:12.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:12 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Oct 10 06:30:12 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Oct 10 06:30:12 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:30:12 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:30:12 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Oct 10 06:30:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:12 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:12 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:13 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:13 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:13 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:13.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:13 np0005479823 nova_compute[235775]: 2025-10-10 10:30:13.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:30:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:13 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:13 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:30:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:30:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:13 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:30:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:14 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:30:14 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:14 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:30:14 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:14.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:30:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:14 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:14 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:15 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:15 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:15 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:15.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:15 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:30:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:15 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:15 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:16 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:16 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:16 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:16.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:16 np0005479823 nova_compute[235775]: 2025-10-10 10:30:16.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:30:16 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:30:16 np0005479823 ceph-mon[74913]: from='mgr.14709 192.168.122.100:0/3269626124' entity='mgr.compute-0.xkdepb' 
Oct 10 06:30:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:16 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:16 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:17 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:17 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:17 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:17.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:17 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:17 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:18 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:18 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:18 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:18.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:18 np0005479823 nova_compute[235775]: 2025-10-10 10:30:18.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:30:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:18 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:18 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:30:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:30:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:18 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:30:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:19 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:30:19 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:19 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:30:19 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:19.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:30:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:19 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:19 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:20 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:20 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:20 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:20.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:20 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:30:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:20 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:20 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:21 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:21 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:21 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:21.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:21 np0005479823 nova_compute[235775]: 2025-10-10 10:30:21.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:30:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:21 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:21 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:22 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:22 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:22 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:22.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:22 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:22 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:23 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:23 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:23 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:23.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:23 np0005479823 nova_compute[235775]: 2025-10-10 10:30:23.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:30:23 np0005479823 podman[259549]: 2025-10-10 10:30:23.695549134 +0000 UTC m=+0.063740137 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct 10 06:30:23 np0005479823 podman[259547]: 2025-10-10 10:30:23.70386793 +0000 UTC m=+0.070508623 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Oct 10 06:30:23 np0005479823 podman[259548]: 2025-10-10 10:30:23.728872748 +0000 UTC m=+0.097097962 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct 10 06:30:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:23 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:23 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:30:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:30:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:23 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:30:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:24 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:30:24 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:24 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:24 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:24.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:24 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:24 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:25 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:25 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:25 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:25.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:25 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:30:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:25 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:25 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:26 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:26 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:26 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:26.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:26 np0005479823 nova_compute[235775]: 2025-10-10 10:30:26.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:30:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Oct 10 06:30:26 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1496174560' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Oct 10 06:30:26 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Oct 10 06:30:26 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1496174560' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Oct 10 06:30:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:26 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:26 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:27 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:27 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:27 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:27.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:27 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:27 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:28 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:28 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:28 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:28.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:28 np0005479823 nova_compute[235775]: 2025-10-10 10:30:28.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:30:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:28 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:28 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:30:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:30:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:28 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:30:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:29 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:30:29 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:29 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:30:29 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:29.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:30:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:29 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:29 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:30 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:30 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:30:30 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:30.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:30:30 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:30:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:30 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:30 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:31 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:31 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:30:31 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:31.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:30:31 np0005479823 nova_compute[235775]: 2025-10-10 10:30:31.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:30:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:31 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:31 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:32 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:32 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:32 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:32.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:32 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:32 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:33 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:33 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:30:33 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:33.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:30:33 np0005479823 nova_compute[235775]: 2025-10-10 10:30:33.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:30:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:33 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:33 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:30:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:30:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:33 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:30:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:34 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:30:34 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:34 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:30:34 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:34.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:30:34 np0005479823 systemd-logind[796]: New session 60 of user zuul.
Oct 10 06:30:34 np0005479823 systemd[1]: Started Session 60 of User zuul.
Oct 10 06:30:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:34 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:34 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:35 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:35 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:35 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:35.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:35 np0005479823 podman[259662]: 2025-10-10 10:30:35.327893293 +0000 UTC m=+0.081102200 container health_status 2071d93f2b2f022757a53c04bd8ce42081b7cd2a540c93af85d632781575094d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 10 06:30:35 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:30:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:35 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:35 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:36 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:36 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:36 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:36.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:36 np0005479823 nova_compute[235775]: 2025-10-10 10:30:36.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:30:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:36 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:36 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:37 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:37 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:30:37 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:37.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:30:37 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Oct 10 06:30:37 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1165768626' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Oct 10 06:30:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:37 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:37 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:38 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:38 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000033s ======
Oct 10 06:30:38 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:38.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Oct 10 06:30:38 np0005479823 nova_compute[235775]: 2025-10-10 10:30:38.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:30:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:38 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:38 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:30:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:30:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:38 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:30:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:39 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:30:39 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:39 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:30:39 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:39.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.669300) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760092239669368, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 1594, "num_deletes": 258, "total_data_size": 3890492, "memory_usage": 3937872, "flush_reason": "Manual Compaction"}
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760092239686596, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 2541098, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39513, "largest_seqno": 41101, "table_properties": {"data_size": 2534363, "index_size": 3806, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1861, "raw_key_size": 14470, "raw_average_key_size": 20, "raw_value_size": 2520675, "raw_average_value_size": 3486, "num_data_blocks": 164, "num_entries": 723, "num_filter_entries": 723, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760092107, "oldest_key_time": 1760092107, "file_creation_time": 1760092239, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 17351 microseconds, and 9966 cpu microseconds.
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.686659) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 2541098 bytes OK
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.686683) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.687900) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.687919) EVENT_LOG_v1 {"time_micros": 1760092239687912, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.687941) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 3883108, prev total WAL file size 3883108, number of live WAL files 2.
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.689476) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303036' seq:72057594037927935, type:22 .. '6C6F676D0031323630' seq:0, type:0; will stop at (end)
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(2481KB)], [75(12MB)]
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760092239689502, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 15157428, "oldest_snapshot_seqno": -1}
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6866 keys, 14996081 bytes, temperature: kUnknown
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760092239772856, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 14996081, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14950753, "index_size": 27040, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17221, "raw_key_size": 180370, "raw_average_key_size": 26, "raw_value_size": 14827539, "raw_average_value_size": 2159, "num_data_blocks": 1069, "num_entries": 6866, "num_filter_entries": 6866, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1760089519, "oldest_key_time": 0, "file_creation_time": 1760092239, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c3989026-94dc-41dd-a555-ef3b3fd6f1b8", "db_session_id": "2V808MJHDIXUCLJZ1TSV", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.773057) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 14996081 bytes
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.774147) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 181.7 rd, 179.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 12.0 +0.0 blob) out(14.3 +0.0 blob), read-write-amplify(11.9) write-amplify(5.9) OK, records in: 7400, records dropped: 534 output_compression: NoCompression
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.774162) EVENT_LOG_v1 {"time_micros": 1760092239774154, "job": 46, "event": "compaction_finished", "compaction_time_micros": 83420, "compaction_time_cpu_micros": 26259, "output_level": 6, "num_output_files": 1, "total_output_size": 14996081, "num_input_records": 7400, "num_output_records": 6866, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760092239774629, "job": 46, "event": "table_file_deletion", "file_number": 77}
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1760092239776317, "job": 46, "event": "table_file_deletion", "file_number": 75}
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.689417) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.776357) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.776362) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.776363) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.776365) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:30:39 np0005479823 ceph-mon[74913]: rocksdb: (Original Log Time 2025/10/10-10:30:39.776366) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Oct 10 06:30:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:39 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:39 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:40 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:40 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:40 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:40.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:40 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:30:40 np0005479823 ovs-vsctl[259968]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct 10 06:30:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:40 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:40 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:41 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:41 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:41 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:41.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:41 np0005479823 nova_compute[235775]: 2025-10-10 10:30:41.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:30:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:30:41.485 141795 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct 10 06:30:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:30:41.486 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct 10 06:30:41 np0005479823 ovn_metadata_agent[141790]: 2025-10-10 10:30:41.486 141795 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct 10 06:30:41 np0005479823 virtqemud[235088]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct 10 06:30:41 np0005479823 virtqemud[235088]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct 10 06:30:41 np0005479823 virtqemud[235088]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct 10 06:30:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:41 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:41 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:42 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: cache status {prefix=cache status} (starting...)
Oct 10 06:30:42 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:42 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:42 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:42.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:42 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: client ls {prefix=client ls} (starting...)
Oct 10 06:30:42 np0005479823 lvm[260315]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Oct 10 06:30:42 np0005479823 lvm[260315]: VG ceph_vg0 finished
Oct 10 06:30:42 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: damage ls {prefix=damage ls} (starting...)
Oct 10 06:30:42 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Oct 10 06:30:42 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/561687218' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Oct 10 06:30:42 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: dump loads {prefix=dump loads} (starting...)
Oct 10 06:30:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:42 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:42 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:43 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Oct 10 06:30:43 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Oct 10 06:30:43 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:43 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:43 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:43.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:43 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Oct 10 06:30:43 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1504914682' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Oct 10 06:30:43 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Oct 10 06:30:43 np0005479823 nova_compute[235775]: 2025-10-10 10:30:43.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:30:43 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Oct 10 06:30:43 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Oct 10 06:30:43 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/171472224' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Oct 10 06:30:43 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Oct 10 06:30:43 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: get subtrees {prefix=get subtrees} (starting...)
Oct 10 06:30:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:43 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:43 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:43 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:30:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:30:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:30:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:44 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:30:44 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:44 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:30:44 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:44.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:30:44 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: ops {prefix=ops} (starting...)
Oct 10 06:30:44 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Oct 10 06:30:44 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2269363297' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Oct 10 06:30:44 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Oct 10 06:30:44 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/17616546' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Oct 10 06:30:44 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Oct 10 06:30:44 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1752163504' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct 10 06:30:44 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: session ls {prefix=session ls} (starting...)
Oct 10 06:30:44 np0005479823 ceph-mds[84723]: mds.cephfs.compute-2.vlgajy asok_command: status {prefix=status} (starting...)
Oct 10 06:30:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:44 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:44 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:45 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:45 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:45 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:45.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Oct 10 06:30:45 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3206344806' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Oct 10 06:30:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:30:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Oct 10 06:30:45 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2827576184' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Oct 10 06:30:45 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Oct 10 06:30:45 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2830212554' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Oct 10 06:30:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:45 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:45 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:46 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Oct 10 06:30:46 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2107248748' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 06:30:46 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:46 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:46 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:46.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:46 np0005479823 nova_compute[235775]: 2025-10-10 10:30:46.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:30:46 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Oct 10 06:30:46 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3723288549' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Oct 10 06:30:46 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Oct 10 06:30:46 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2575731666' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Oct 10 06:30:46 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Oct 10 06:30:46 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1715209047' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct 10 06:30:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:46 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:46 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:47 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Oct 10 06:30:47 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/910388127' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Oct 10 06:30:47 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:47 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:47 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:47.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:47 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Oct 10 06:30:47 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4142265369' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Oct 10 06:30:47 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Oct 10 06:30:47 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1724959652' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct 10 06:30:47 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Oct 10 06:30:47 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1085433634' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Oct 10 06:30:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:47 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:47 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:48 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:48 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:48 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:48.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1032192 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1032192 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1032192 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71278592 unmapped: 1032192 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71131136 unmapped: 1179648 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71147520 unmapped: 1163264 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 1155072 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71155712 unmapped: 1155072 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 1146880 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 1146880 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71163904 unmapped: 1146880 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71172096 unmapped: 1138688 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71196672 unmapped: 1114112 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71204864 unmapped: 1105920 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71213056 unmapped: 1097728 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71213056 unmapped: 1097728 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71229440 unmapped: 1081344 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71237632 unmapped: 1073152 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71245824 unmapped: 1064960 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb229ed400 session 0x55cb257c9860
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71262208 unmapped: 1048576 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 rsyslogd[1001]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71286784 unmapped: 1024000 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb252b0400 session 0x55cb257770e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 999424 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 999424 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 999424 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71311360 unmapped: 999424 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 889284 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71319552 unmapped: 991232 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 983040 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 983040 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 193.200393677s of 193.215057373s, submitted: 3
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71327744 unmapped: 983040 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 966656 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 888693 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71344128 unmapped: 966656 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 950272 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 950272 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71360512 unmapped: 950272 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71368704 unmapped: 942080 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890205 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71368704 unmapped: 942080 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71368704 unmapped: 942080 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71368704 unmapped: 942080 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890205 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890205 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71376896 unmapped: 933888 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 917504 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890205 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 917504 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 917504 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb23c79400 session 0x55cb24f1a3c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71393280 unmapped: 917504 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 909312 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71401472 unmapped: 909312 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890205 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71409664 unmapped: 901120 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71409664 unmapped: 901120 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71409664 unmapped: 901120 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 890205 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 34.551929474s of 34.587165833s, submitted: 2
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 891717 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71417856 unmapped: 892928 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 894150 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 894150 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb252b1800 session 0x55cb25234960
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 894150 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 894150 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71434240 unmapped: 876544 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71442432 unmapped: 868352 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71442432 unmapped: 868352 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71442432 unmapped: 868352 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 27.263769150s of 27.276128769s, submitted: 4
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897174 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897174 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71483392 unmapped: 827392 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb22622400 session 0x55cb23c641e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71491584 unmapped: 819200 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb222fb800 session 0x55cb252354a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71507968 unmapped: 802816 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 794624 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 794624 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 794624 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 794624 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71516160 unmapped: 794624 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71524352 unmapped: 786432 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 770048 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 895992 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 770048 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 41.978160858s of 41.991683960s, submitted: 4
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 770048 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71540736 unmapped: 770048 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71548928 unmapped: 761856 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897504 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71557120 unmapped: 753664 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71573504 unmapped: 737280 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71565312 unmapped: 745472 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb252b1c00 session 0x55cb2397f4a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896322 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 53.931362152s of 53.942428589s, submitted: 3
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897834 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71581696 unmapped: 729088 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897834 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897243 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71598080 unmapped: 712704 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897243 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 897243 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.023803711s of 22.031444550s, submitted: 2
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71606272 unmapped: 704512 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 688128 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896652 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 688128 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 688128 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71622656 unmapped: 688128 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896652 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896652 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb2515f000 session 0x55cb256783c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896652 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71630848 unmapped: 679936 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896652 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 896652 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 29.378730774s of 29.382411957s, submitted: 1
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898164 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898164 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71647232 unmapped: 663552 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898164 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb23c79400 session 0x55cb256ab0e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898164 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898164 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898164 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71663616 unmapped: 647168 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71680000 unmapped: 630784 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 32.485904694s of 32.491119385s, submitted: 1
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899676 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899676 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71704576 unmapped: 606208 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread fragmentation_score=0.000024 took=0.000092s
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71712768 unmapped: 598016 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71729152 unmapped: 581632 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71729152 unmapped: 581632 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71737344 unmapped: 573440 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71753728 unmapped: 557056 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1200.0 total, 600.0 interval
Cumulative writes: 5994 writes, 24K keys, 5994 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
Cumulative WAL: 5994 writes, 1097 syncs, 5.46 writes per sync, written: 0.02 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 448 writes, 699 keys, 448 commit groups, 1.0 writes per commit group, ingest: 0.23 MB, 0.00 MB/s
Interval WAL: 448 writes, 217 syncs, 2.06 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.0 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **

** Compaction Stats [m-0] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-0] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.0 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55cb21511350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [m-0] **

** Compaction Stats [m-1] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0

** Compaction Stats [m-1] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 1200.0 total, 600.0 interval
Flush(GB): cumulative 0.000, interval 0.000
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71770112 unmapped: 540672 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71778304 unmapped: 532480 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 524288 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 524288 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 524288 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71786496 unmapped: 524288 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb23d67400 session 0x55cb257c9e00
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 899085 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 116.097770691s of 116.122451782s, submitted: 2
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898494 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71802880 unmapped: 507904 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898494 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898494 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898494 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 898494 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.223886490s of 22.228187561s, submitted: 1
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb222fb800 session 0x55cb25216f00
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71794688 unmapped: 516096 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 71811072 unmapped: 499712 heap: 72310784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 901518 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 73965568 unmapped: 1490944 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 901518 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 901518 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75137024 unmapped: 319488 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.055137634s of 15.676420212s, submitted: 212
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb252b0400 session 0x55cb257c9680
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904542 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 903951 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75112448 unmapped: 344064 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb23d67400 session 0x55cb24a952c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 903360 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 903360 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75120640 unmapped: 335872 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 903360 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 903360 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75128832 unmapped: 327680 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 29.002698898s of 29.016599655s, submitted: 4
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75145216 unmapped: 311296 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75145216 unmapped: 311296 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75145216 unmapped: 311296 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75145216 unmapped: 311296 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75153408 unmapped: 303104 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb252b1c00 session 0x55cb24f1ab40
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 904281 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 81.090309143s of 81.102882385s, submitted: 3
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 905793 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75169792 unmapped: 286720 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 905202 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 905202 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb23c79400 session 0x55cb25216000
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 905202 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 905202 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75186176 unmapped: 270336 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 28.442283630s of 28.465816498s, submitted: 2
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 908226 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 908226 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 907635 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 907635 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75202560 unmapped: 253952 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 heartbeat osd_stat(store_statfs(0x4fca74000/0x0/0x4ffc00000, data 0xf77c9/0x1a8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 ms_handle_reset con 0x55cb23c79400 session 0x55cb252174a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.102451324s of 16.118749619s, submitted: 3
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 75218944 unmapped: 237568 heap: 75456512 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 144 handle_osd_map epochs [144,145], i have 144, src has [1,145]
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 145 handle_osd_map epochs [145,145], i have 145, src has [1,145]
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 76349440 unmapped: 155648 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 146 handle_osd_map epochs [146,146], i have 146, src has [1,146]
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 146 ms_handle_reset con 0x55cb23d67400 session 0x55cb237c54a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 76398592 unmapped: 106496 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 146 handle_osd_map epochs [146,147], i have 146, src has [1,147]
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0xfdb10/0x1b2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 76447744 unmapped: 57344 heap: 76505088 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 147 heartbeat osd_stat(store_statfs(0x4fca64000/0x0/0x4ffc00000, data 0xffae2/0x1b5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926645 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 147 handle_osd_map epochs [147,148], i have 147, src has [1,148]
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 148 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 148 ms_handle_reset con 0x55cb252b1c00 session 0x55cb23dda000
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77488128 unmapped: 65536 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca62000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca62000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929443 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca62000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.733337402s of 13.866815567s, submitted: 54
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930955 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca62000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931459 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca63000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930277 data_alloc: 218103808 data_used: 53248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77504512 unmapped: 49152 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca63000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930429 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca63000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930429 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca63000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca63000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930429 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 148 heartbeat osd_stat(store_statfs(0x4fca63000/0x0/0x4ffc00000, data 0x101bf1/0x1b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77529088 unmapped: 24576 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 148 ms_handle_reset con 0x55cb2515f000 session 0x55cb256aa1e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 29.409894943s of 29.425935745s, submitted: 4
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 148 ms_handle_reset con 0x55cb25754c00 session 0x55cb256aa5a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 148 ms_handle_reset con 0x55cb23c79400 session 0x55cb25679e00
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77373440 unmapped: 180224 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Oct 10 06:30:48 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2303646199' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930429 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77373440 unmapped: 180224 heap: 77553664 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 148 handle_osd_map epochs [148,149], i have 148, src has [1,149]
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77373440 unmapped: 1228800 heap: 78602240 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 149 handle_osd_map epochs [149,150], i have 149, src has [1,150]
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 150 heartbeat osd_stat(store_statfs(0x4fca5f000/0x0/0x4ffc00000, data 0x103cdd/0x1bc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 150 ms_handle_reset con 0x55cb23d67400 session 0x55cb24a1e960
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 150 ms_handle_reset con 0x55cb2515f000 session 0x55cb25217860
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 150 ms_handle_reset con 0x55cb252b1c00 session 0x55cb250810e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 150 ms_handle_reset con 0x55cb25755000 session 0x55cb23ddb860
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 150 ms_handle_reset con 0x55cb23c79400 session 0x55cb23ddb0e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77832192 unmapped: 11272192 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 77840384 unmapped: 11264000 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 150 handle_osd_map epochs [150,151], i have 150, src has [1,151]
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78888960 unmapped: 10215424 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c4000/0x0/0x4ffc00000, data 0xa9ae51/0xb56000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1019665 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb24c6a1e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78938112 unmapped: 10166272 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78938112 unmapped: 10166272 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f000 session 0x55cb25080d20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78938112 unmapped: 10166272 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78938112 unmapped: 10166272 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c4000/0x0/0x4ffc00000, data 0xa9ae51/0xb56000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b1c00 session 0x55cb252a34a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.058108330s of 10.286009789s, submitted: 78
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb257765a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78864384 unmapped: 10240000 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1018571 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 78864384 unmapped: 10240000 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c5000/0x0/0x4ffc00000, data 0xa9ae61/0xb57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c5000/0x0/0x4ffc00000, data 0xa9ae61/0xb57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 80191488 unmapped: 8912896 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c5000/0x0/0x4ffc00000, data 0xa9ae61/0xb57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88702976 unmapped: 401408 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88702976 unmapped: 401408 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c5000/0x0/0x4ffc00000, data 0xa9ae61/0xb57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1085887 data_alloc: 234881024 data_used: 10084352
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c5000/0x0/0x4ffc00000, data 0xa9ae61/0xb57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1086640 data_alloc: 234881024 data_used: 10084352
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fc0c5000/0x0/0x4ffc00000, data 0xa9ae61/0xb57000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x2fdf9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.044337273s of 12.061671257s, submitted: 5
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 88735744 unmapped: 368640 heap: 89104384 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 95715328 unmapped: 4505600 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 96894976 unmapped: 3325952 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97017856 unmapped: 3203072 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1210611 data_alloc: 234881024 data_used: 10915840
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97026048 unmapped: 3194880 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97026048 unmapped: 3194880 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97026048 unmapped: 3194880 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97026048 unmapped: 3194880 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97026048 unmapped: 3194880 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1210611 data_alloc: 234881024 data_used: 10915840
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97026048 unmapped: 3194880 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97083392 unmapped: 3137536 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97083392 unmapped: 3137536 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97083392 unmapped: 3137536 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97083392 unmapped: 3137536 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1211523 data_alloc: 234881024 data_used: 10985472
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97091584 unmapped: 3129344 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97091584 unmapped: 3129344 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97091584 unmapped: 3129344 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97091584 unmapped: 3129344 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97091584 unmapped: 3129344 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1211523 data_alloc: 234881024 data_used: 10985472
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97099776 unmapped: 3121152 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97099776 unmapped: 3121152 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97099776 unmapped: 3121152 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b1c00 session 0x55cb2397e000
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755800 session 0x55cb2397f2c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755c00 session 0x55cb2397e960
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97099776 unmapped: 3121152 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.521245956s of 22.787330627s, submitted: 111
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515fc00 session 0x55cb24c590e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb251034a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa060000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 96673792 unmapped: 3547136 heap: 100220928 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1207099 data_alloc: 234881024 data_used: 10989568
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb25103680
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515fc00 session 0x55cb25102780
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b1c00 session 0x55cb24a1f860
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755800 session 0x55cb24a1ed20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755c00 session 0x55cb24a1f0e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb22a681e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb2397e3c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9827000/0x0/0x4ffc00000, data 0x2197e71/0x2255000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9827000/0x0/0x4ffc00000, data 0x2197e71/0x2255000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1274581 data_alloc: 234881024 data_used: 10989568
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9827000/0x0/0x4ffc00000, data 0x2197e71/0x2255000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97173504 unmapped: 16695296 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515fc00 session 0x55cb22a501e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97165312 unmapped: 16703488 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b1c00 session 0x55cb222bcf00
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97165312 unmapped: 16703488 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97165312 unmapped: 16703488 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755800 session 0x55cb257765a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.540281296s of 10.644290924s, submitted: 20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb25776f00
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1280396 data_alloc: 234881024 data_used: 10989568
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97419264 unmapped: 16449536 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97042432 unmapped: 16826368 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97042432 unmapped: 16826368 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 100958208 unmapped: 12910592 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105037824 unmapped: 8830976 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1337852 data_alloc: 234881024 data_used: 19390464
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105037824 unmapped: 8830976 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105070592 unmapped: 8798208 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105070592 unmapped: 8798208 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105070592 unmapped: 8798208 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105103360 unmapped: 8765440 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1339364 data_alloc: 234881024 data_used: 19390464
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.622159004s of 10.657759666s, submitted: 8
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105103360 unmapped: 8765440 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105103360 unmapped: 8765440 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9801000/0x0/0x4ffc00000, data 0x21bbea4/0x227b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105103360 unmapped: 8765440 heap: 113868800 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111075328 unmapped: 3850240 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111214592 unmapped: 3710976 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1423561 data_alloc: 234881024 data_used: 20221952
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8ead000/0x0/0x4ffc00000, data 0x2b07ea4/0x2bc7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111443968 unmapped: 3481600 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8ea5000/0x0/0x4ffc00000, data 0x2b0fea4/0x2bcf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111443968 unmapped: 3481600 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111640576 unmapped: 3284992 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111640576 unmapped: 3284992 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111648768 unmapped: 3276800 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8ea5000/0x0/0x4ffc00000, data 0x2b0fea4/0x2bcf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1423561 data_alloc: 234881024 data_used: 20221952
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111648768 unmapped: 3276800 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.157508850s of 10.318835258s, submitted: 79
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8ea5000/0x0/0x4ffc00000, data 0x2b0fea4/0x2bcf000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111673344 unmapped: 3252224 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111673344 unmapped: 3252224 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb257774a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515fc00 session 0x55cb22e632c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111673344 unmapped: 3252224 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b1c00 session 0x55cb25235860
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 12279808 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220056 data_alloc: 234881024 data_used: 10858496
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 12279808 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa067000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 12279808 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa067000/0x0/0x4ffc00000, data 0x1957e61/0x1a14000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 12279808 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 102645760 unmapped: 12279808 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb252165a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23c79400 session 0x55cb251023c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 102637568 unmapped: 12288000 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 969360 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb23c64d20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93929472 unmapped: 20996096 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 968651 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 968651 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 968651 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 968651 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93454336 unmapped: 21471232 heap: 114925568 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 968651 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8b9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 34.769847870s of 34.920715332s, submitted: 64
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97558528 unmapped: 28000256 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb256aba40
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb256aa960
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa7c3000/0x0/0x4ffc00000, data 0x11fedef/0x12b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1091241 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa7c3000/0x0/0x4ffc00000, data 0x11fedef/0x12b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb256aad20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93888512 unmapped: 31670272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1091241 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93896704 unmapped: 31662080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 93896704 unmapped: 31662080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa7c3000/0x0/0x4ffc00000, data 0x11fedef/0x12b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99975168 unmapped: 25583616 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99975168 unmapped: 25583616 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99975168 unmapped: 25583616 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1201137 data_alloc: 234881024 data_used: 12959744
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99975168 unmapped: 25583616 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99975168 unmapped: 25583616 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa7c3000/0x0/0x4ffc00000, data 0x11fedef/0x12b9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99983360 unmapped: 25575424 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99983360 unmapped: 25575424 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99983360 unmapped: 25575424 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1201137 data_alloc: 234881024 data_used: 12959744
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99991552 unmapped: 25567232 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 100007936 unmapped: 25550848 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 21.422227859s of 21.509000778s, submitted: 20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106987520 unmapped: 18571264 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a14000/0x0/0x4ffc00000, data 0x1faddef/0x2068000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108470272 unmapped: 17088512 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a14000/0x0/0x4ffc00000, data 0x1faddef/0x2068000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108478464 unmapped: 17080320 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1319733 data_alloc: 234881024 data_used: 14139392
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108486656 unmapped: 17072128 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108486656 unmapped: 17072128 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f996e000/0x0/0x4ffc00000, data 0x2053def/0x210e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108486656 unmapped: 17072128 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f996e000/0x0/0x4ffc00000, data 0x2053def/0x210e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108486656 unmapped: 17072128 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107167744 unmapped: 18391040 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311406 data_alloc: 234881024 data_used: 14139392
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107167744 unmapped: 18391040 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107167744 unmapped: 18391040 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f994a000/0x0/0x4ffc00000, data 0x2077def/0x2132000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107167744 unmapped: 18391040 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107175936 unmapped: 18382848 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.874601364s of 12.195711136s, submitted: 124
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 17334272 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311150 data_alloc: 234881024 data_used: 14139392
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9944000/0x0/0x4ffc00000, data 0x207ddef/0x2138000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9944000/0x0/0x4ffc00000, data 0x207ddef/0x2138000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311150 data_alloc: 234881024 data_used: 14139392
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24a605a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108232704 unmapped: 17326080 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 17317888 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9941000/0x0/0x4ffc00000, data 0x2080def/0x213b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 17317888 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 17317888 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311694 data_alloc: 234881024 data_used: 14151680
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 17317888 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 17317888 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108240896 unmapped: 17317888 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.061926842s of 14.079800606s, submitted: 4
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9930000/0x0/0x4ffc00000, data 0x2091def/0x214c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 17195008 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9930000/0x0/0x4ffc00000, data 0x2091def/0x214c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb25678f00
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 17186816 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 982200 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23c79400 session 0x55cb25824b40
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa95b000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98844672 unmapped: 26714112 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98254848 unmapped: 27303936 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa95b000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98254848 unmapped: 27303936 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98254848 unmapped: 27303936 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa95b000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983712 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984032 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984032 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 nova_compute[235775]: 2025-10-10 10:30:48.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984032 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984032 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98263040 unmapped: 27295744 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb24f1b4a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515fc00 session 0x55cb24f1ad20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24f1a3c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23c79400 session 0x55cb24f1a780
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 28.527706146s of 28.564867020s, submitted: 20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 100900864 unmapped: 24657920 heap: 125558784 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb24f1a1e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb8ba000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98394112 unmapped: 30842880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98394112 unmapped: 30842880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98394112 unmapped: 30842880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065364 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb249925a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98394112 unmapped: 30842880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b1c00 session 0x55cb24992f00
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 98213888 unmapped: 31023104 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24c6be00
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23c79400 session 0x55cb24c6a1e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faea5000/0x0/0x4ffc00000, data 0xb1cdef/0xbd7000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97878016 unmapped: 31358976 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97878016 unmapped: 31358976 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 97427456 unmapped: 31809536 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1092582 data_alloc: 218103808 data_used: 3399680
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99352576 unmapped: 29884416 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99434496 unmapped: 29802496 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fae80000/0x0/0x4ffc00000, data 0xb40dff/0xbfc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99467264 unmapped: 29769728 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fae80000/0x0/0x4ffc00000, data 0xb40dff/0xbfc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1131798 data_alloc: 234881024 data_used: 9220096
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fae80000/0x0/0x4ffc00000, data 0xb40dff/0xbfc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fae80000/0x0/0x4ffc00000, data 0xb40dff/0xbfc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fae80000/0x0/0x4ffc00000, data 0xb40dff/0xbfc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 99500032 unmapped: 29736960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1133926 data_alloc: 234881024 data_used: 9277440
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.591819763s of 18.735073090s, submitted: 24
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fae80000/0x0/0x4ffc00000, data 0xb40dff/0xbfc000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x417f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107536384 unmapped: 21700608 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107692032 unmapped: 21544960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107692032 unmapped: 21544960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107692032 unmapped: 21544960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107692032 unmapped: 21544960 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1232606 data_alloc: 234881024 data_used: 9793536
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9f04000/0x0/0x4ffc00000, data 0x16abdff/0x1767000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9f04000/0x0/0x4ffc00000, data 0x16abdff/0x1767000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107724800 unmapped: 21512192 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9f04000/0x0/0x4ffc00000, data 0x16abdff/0x1767000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106569728 unmapped: 22667264 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ee1000/0x0/0x4ffc00000, data 0x16cfdff/0x178b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106586112 unmapped: 22650880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25754800 session 0x55cb25081680
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106586112 unmapped: 22650880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106586112 unmapped: 22650880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1230270 data_alloc: 234881024 data_used: 9854976
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106586112 unmapped: 22650880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ee1000/0x0/0x4ffc00000, data 0x16cfdff/0x178b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106586112 unmapped: 22650880 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.891391754s of 12.132454872s, submitted: 125
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106610688 unmapped: 22626304 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ed7000/0x0/0x4ffc00000, data 0x16d9dff/0x1795000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106610688 unmapped: 22626304 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106610688 unmapped: 22626304 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1230430 data_alloc: 234881024 data_used: 9854976
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106610688 unmapped: 22626304 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106610688 unmapped: 22626304 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515e400 session 0x55cb24f1ba40
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106610688 unmapped: 22626304 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515ec00 session 0x55cb24a1bc20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106627072 unmapped: 22609920 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb222bcf00
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f97ba000/0x0/0x4ffc00000, data 0x1df6dff/0x1eb2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107798528 unmapped: 21438464 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1296232 data_alloc: 234881024 data_used: 9854976
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9630000/0x0/0x4ffc00000, data 0x1f80dff/0x203c000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 21405696 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 21405696 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 21405696 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107831296 unmapped: 21405696 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.719927788s of 11.824364662s, submitted: 25
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23c79400 session 0x55cb24a1e960
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f962d000/0x0/0x4ffc00000, data 0x1f83dff/0x203f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107929600 unmapped: 21307392 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1300762 data_alloc: 234881024 data_used: 9854976
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107921408 unmapped: 21315584 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 14811136 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 14811136 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 14811136 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114425856 unmapped: 14811136 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1360346 data_alloc: 234881024 data_used: 18640896
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f962c000/0x0/0x4ffc00000, data 0x1f83e22/0x2040000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114458624 unmapped: 14778368 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f962c000/0x0/0x4ffc00000, data 0x1f83e22/0x2040000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 14745600 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114565120 unmapped: 14671872 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb2397f2c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114565120 unmapped: 14671872 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114565120 unmapped: 14671872 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1361578 data_alloc: 234881024 data_used: 18644992
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9629000/0x0/0x4ffc00000, data 0x1f84e22/0x2041000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114565120 unmapped: 14671872 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.852634430s of 11.898006439s, submitted: 19
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122814464 unmapped: 6422528 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 7430144 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8664000/0x0/0x4ffc00000, data 0x2f4be22/0x3008000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 7430144 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121806848 unmapped: 7430144 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1493940 data_alloc: 234881024 data_used: 20471808
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8664000/0x0/0x4ffc00000, data 0x2f4be22/0x3008000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 7397376 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8664000/0x0/0x4ffc00000, data 0x2f4be22/0x3008000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121839616 unmapped: 7397376 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 7389184 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb237c5a40
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515e400 session 0x55cb252350e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121847808 unmapped: 7389184 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb23c1e5a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245624 data_alloc: 234881024 data_used: 9854976
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ed2000/0x0/0x4ffc00000, data 0x16dddff/0x1799000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ed2000/0x0/0x4ffc00000, data 0x16dddff/0x1799000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ed2000/0x0/0x4ffc00000, data 0x16dddff/0x1799000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245624 data_alloc: 234881024 data_used: 9854976
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ed2000/0x0/0x4ffc00000, data 0x16dddff/0x1799000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ed2000/0x0/0x4ffc00000, data 0x16dddff/0x1799000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114188288 unmapped: 15048704 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb24c6ad20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb257163c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.070486069s of 16.434377670s, submitted: 164
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb237c5c20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1014831 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1014831 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1014831 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1014831 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1014831 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21479424 heap: 129236992 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.723497391s of 26.753862381s, submitted: 18
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb239192c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb23c1fe00
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb252a21e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb250814a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb257c8d20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106659840 unmapped: 34717696 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1118589 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106659840 unmapped: 34717696 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106659840 unmapped: 34717696 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106659840 unmapped: 34717696 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa704000/0x0/0x4ffc00000, data 0xeace51/0xf68000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106659840 unmapped: 34717696 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb2397e1e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb2397e3c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106676224 unmapped: 34701312 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa704000/0x0/0x4ffc00000, data 0xeace51/0xf68000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1118589 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb25678f00
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb257774a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106135552 unmapped: 35241984 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa6de000/0x0/0x4ffc00000, data 0xed0e84/0xf8e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 106135552 unmapped: 35241984 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 105660416 unmapped: 35717120 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa6de000/0x0/0x4ffc00000, data 0xed0e84/0xf8e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220197 data_alloc: 234881024 data_used: 14155776
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa6de000/0x0/0x4ffc00000, data 0xed0e84/0xf8e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220197 data_alloc: 234881024 data_used: 14155776
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa6de000/0x0/0x4ffc00000, data 0xed0e84/0xf8e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110067712 unmapped: 31309824 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.917535782s of 19.063180923s, submitted: 58
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118210560 unmapped: 23166976 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 23740416 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ab4000/0x0/0x4ffc00000, data 0x1afae84/0x1bb8000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1331335 data_alloc: 234881024 data_used: 14376960
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 23740416 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 23740416 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 23740416 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 23740416 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9aa6000/0x0/0x4ffc00000, data 0x1b08e84/0x1bc6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9aa6000/0x0/0x4ffc00000, data 0x1b08e84/0x1bc6000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117637120 unmapped: 23740416 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1331335 data_alloc: 234881024 data_used: 14376960
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a82000/0x0/0x4ffc00000, data 0x1b2ce84/0x1bea000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 1800.0 total, 600.0 interval
Cumulative writes: 8276 writes, 33K keys, 8276 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.02 MB/s
Cumulative WAL: 8276 writes, 2019 syncs, 4.10 writes per sync, written: 0.03 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 2282 writes, 8748 keys, 2282 commit groups, 1.0 writes per commit group, ingest: 10.36 MB, 0.02 MB/s
Interval WAL: 2282 writes, 922 syncs, 2.48 writes per sync, written: 0.01 GB, 0.02 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a82000/0x0/0x4ffc00000, data 0x1b2ce84/0x1bea000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326463 data_alloc: 234881024 data_used: 14381056
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.047369957s of 13.274172783s, submitted: 111
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a7c000/0x0/0x4ffc00000, data 0x1b32e84/0x1bf0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a7c000/0x0/0x4ffc00000, data 0x1b32e84/0x1bf0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326375 data_alloc: 234881024 data_used: 14381056
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117645312 unmapped: 23732224 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a6c000/0x0/0x4ffc00000, data 0x1b42e84/0x1c00000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 23633920 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 23633920 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1327543 data_alloc: 234881024 data_used: 14389248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 23633920 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 23633920 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a6c000/0x0/0x4ffc00000, data 0x1b42e84/0x1c00000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 23633920 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117743616 unmapped: 23633920 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb24a1f860
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d02800 session 0x55cb250803c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb25716b40
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb24f1ab40
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.387310028s of 13.403597832s, submitted: 5
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb24f1b680
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb24c6b680
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23944400 session 0x55cb24c6a960
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb25217860
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118210560 unmapped: 23166976 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb24a61c20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1397910 data_alloc: 234881024 data_used: 14389248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f91b1000/0x0/0x4ffc00000, data 0x23fbef6/0x24bb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118210560 unmapped: 23166976 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f91b1000/0x0/0x4ffc00000, data 0x23fbef6/0x24bb000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118210560 unmapped: 23166976 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118210560 unmapped: 23166976 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118210560 unmapped: 23166976 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23944400 session 0x55cb257c9680
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118538240 unmapped: 22839296 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1402283 data_alloc: 234881024 data_used: 14389248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f918b000/0x0/0x4ffc00000, data 0x2420ef6/0x24e0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 118554624 unmapped: 22822912 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 121602048 unmapped: 19775488 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126173184 unmapped: 15204352 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126173184 unmapped: 15204352 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f918b000/0x0/0x4ffc00000, data 0x2420ef6/0x24e0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126263296 unmapped: 15114240 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1463991 data_alloc: 234881024 data_used: 23457792
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126263296 unmapped: 15114240 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126263296 unmapped: 15114240 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9188000/0x0/0x4ffc00000, data 0x2424ef6/0x24e4000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126263296 unmapped: 15114240 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126263296 unmapped: 15114240 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126263296 unmapped: 15114240 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1463991 data_alloc: 234881024 data_used: 23457792
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.817728996s of 15.977606773s, submitted: 48
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 126337024 unmapped: 15040512 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130342912 unmapped: 11034624 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131219456 unmapped: 10158080 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8654000/0x0/0x4ffc00000, data 0x2f58ef6/0x3018000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131219456 unmapped: 10158080 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f863e000/0x0/0x4ffc00000, data 0x2f6eef6/0x302e000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131252224 unmapped: 10125312 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1564155 data_alloc: 234881024 data_used: 24436736
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131293184 unmapped: 10084352 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f863b000/0x0/0x4ffc00000, data 0x2f71ef6/0x3031000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1560275 data_alloc: 234881024 data_used: 24440832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f863b000/0x0/0x4ffc00000, data 0x2f71ef6/0x3031000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.395318985s of 12.603665352s, submitted: 111
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130990080 unmapped: 10387456 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8635000/0x0/0x4ffc00000, data 0x2f77ef6/0x3037000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130990080 unmapped: 10387456 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1560555 data_alloc: 234881024 data_used: 24440832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130990080 unmapped: 10387456 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8635000/0x0/0x4ffc00000, data 0x2f77ef6/0x3037000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130998272 unmapped: 10379264 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130998272 unmapped: 10379264 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130998272 unmapped: 10379264 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8632000/0x0/0x4ffc00000, data 0x2f7aef6/0x303a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131039232 unmapped: 10338304 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1560411 data_alloc: 234881024 data_used: 24440832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131039232 unmapped: 10338304 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131039232 unmapped: 10338304 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131039232 unmapped: 10338304 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8632000/0x0/0x4ffc00000, data 0x2f7aef6/0x303a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8632000/0x0/0x4ffc00000, data 0x2f7aef6/0x303a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131072000 unmapped: 10305536 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131072000 unmapped: 10305536 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.019625664s of 12.032996178s, submitted: 5
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1561307 data_alloc: 234881024 data_used: 24440832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131096576 unmapped: 10280960 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131096576 unmapped: 10280960 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131096576 unmapped: 10280960 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131096576 unmapped: 10280960 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8629000/0x0/0x4ffc00000, data 0x2f80ef6/0x3040000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 10272768 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1560963 data_alloc: 234881024 data_used: 24440832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 10272768 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 10272768 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 10272768 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 10272768 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131104768 unmapped: 10272768 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1561267 data_alloc: 234881024 data_used: 24440832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8627000/0x0/0x4ffc00000, data 0x2f84ef6/0x3044000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.661789894s of 10.690342903s, submitted: 10
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131137536 unmapped: 10240000 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8625000/0x0/0x4ffc00000, data 0x2f87ef6/0x3047000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131170304 unmapped: 10207232 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131178496 unmapped: 10199040 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb229ed400 session 0x55cb2397fe00
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131178496 unmapped: 10199040 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8625000/0x0/0x4ffc00000, data 0x2f87ef6/0x3047000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131178496 unmapped: 10199040 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8625000/0x0/0x4ffc00000, data 0x2f87ef6/0x3047000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1561187 data_alloc: 234881024 data_used: 24440832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130637824 unmapped: 10739712 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130637824 unmapped: 10739712 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130646016 unmapped: 10731520 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f861f000/0x0/0x4ffc00000, data 0x2f8def6/0x304d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130727936 unmapped: 10649600 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130924544 unmapped: 10452992 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1560899 data_alloc: 234881024 data_used: 24440832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130924544 unmapped: 10452992 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130932736 unmapped: 10444800 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.915495872s of 11.539477348s, submitted: 233
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130940928 unmapped: 10436608 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f861c000/0x0/0x4ffc00000, data 0x2f90ef6/0x3050000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130940928 unmapped: 10436608 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb24b1d680
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb25755400 session 0x55cb256ab2c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f861c000/0x0/0x4ffc00000, data 0x2f90ef6/0x3050000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130965504 unmapped: 10412032 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb23dda3c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1344411 data_alloc: 234881024 data_used: 14389248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f93d1000/0x0/0x4ffc00000, data 0x1b92e84/0x1c50000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f93f2000/0x0/0x4ffc00000, data 0x1b71e84/0x1c2f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1345059 data_alloc: 234881024 data_used: 14389248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f93f2000/0x0/0x4ffc00000, data 0x1b71e84/0x1c2f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122920960 unmapped: 18456576 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb24a1ed20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb25102b40
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9a3b000/0x0/0x4ffc00000, data 0x1b71e84/0x1c2f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.092136383s of 10.210209846s, submitted: 50
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb25102780
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051980 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051980 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051980 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051980 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 27082752 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1051980 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb275000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114302976 unmapped: 27074560 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114302976 unmapped: 27074560 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114302976 unmapped: 27074560 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114302976 unmapped: 27074560 heap: 141377536 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23944400 session 0x55cb24f36f00
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24f37860
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb24f363c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb24f374a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 26.982812881s of 27.189233780s, submitted: 65
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb22e62960
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb257770e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111869952 unmapped: 33710080 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faaac000/0x0/0x4ffc00000, data 0xb04e51/0xbc0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1129727 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111869952 unmapped: 33710080 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faaac000/0x0/0x4ffc00000, data 0xb04e51/0xbc0000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb257761e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111869952 unmapped: 33710080 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb257774a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb25776d20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111869952 unmapped: 33710080 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb257763c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111886336 unmapped: 33693696 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111886336 unmapped: 33693696 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1166738 data_alloc: 218103808 data_used: 5058560
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 113631232 unmapped: 31948800 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 113631232 unmapped: 31948800 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faaab000/0x0/0x4ffc00000, data 0xb04e61/0xbc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 113631232 unmapped: 31948800 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 113639424 unmapped: 31940608 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb22e62d20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.018430710s of 10.130161285s, submitted: 41
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb25080780
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faaab000/0x0/0x4ffc00000, data 0xb04e61/0xbc1000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110166016 unmapped: 35414016 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb25717a40
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1059699 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1059699 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1059699 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 109412352 unmapped: 36167680 heap: 145580032 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb25716f00
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb24b1dc20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23d67400 session 0x55cb24b1c960
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb23c1e5a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.854639053s of 12.951243401s, submitted: 33
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb23c1f4a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb249781e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb24a61860
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2515f400 session 0x55cb257c9c20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb257c85a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110026752 unmapped: 39755776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110026752 unmapped: 39755776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110026752 unmapped: 39755776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1126674 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110026752 unmapped: 39755776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110026752 unmapped: 39755776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb257c8000
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb24a601e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110026752 unmapped: 39755776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fab9e000/0x0/0x4ffc00000, data 0xa13def/0xace000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb24a605a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb250e1800 session 0x55cb257770e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110034944 unmapped: 39747584 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110034944 unmapped: 39747584 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: mgrc ms_handle_reset ms_handle_reset con 0x55cb22623000
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/194506248
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/194506248,v1:192.168.122.100:6801/194506248]
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: mgrc handle_mgr_configure stats_period=5
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1128267 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110100480 unmapped: 39682048 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 113369088 unmapped: 36413440 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fab9e000/0x0/0x4ffc00000, data 0xa13def/0xace000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114548736 unmapped: 35233792 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114548736 unmapped: 35233792 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114548736 unmapped: 35233792 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192259 data_alloc: 234881024 data_used: 9543680
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114548736 unmapped: 35233792 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.485980988s of 13.640996933s, submitted: 24
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb250e1800 session 0x55cb25776d20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb257772c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb252350e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065573 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065573 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065573 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065573 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb4aa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 38846464 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb22a51c20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb22a503c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb22a50000
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb222bcf00
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 23.271076202s of 23.354894638s, submitted: 36
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb25080780
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb250e1800 session 0x55cb24b1d680
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb24979a40
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb24f37e00
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb2397f2c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156560 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa6e000/0x0/0x4ffc00000, data 0xb42dff/0xbfe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1156560 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa6e000/0x0/0x4ffc00000, data 0xb42dff/0xbfe000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112074752 unmapped: 37707776 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb23d3e960
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111591424 unmapped: 38191104 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa4a000/0x0/0x4ffc00000, data 0xb66dff/0xc22000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111591424 unmapped: 38191104 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa4a000/0x0/0x4ffc00000, data 0xb66dff/0xc22000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182488 data_alloc: 218103808 data_used: 3469312
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa4a000/0x0/0x4ffc00000, data 0xb66dff/0xc22000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 111968256 unmapped: 37814272 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa4a000/0x0/0x4ffc00000, data 0xb66dff/0xc22000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1221704 data_alloc: 234881024 data_used: 9289728
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4faa4a000/0x0/0x4ffc00000, data 0xb66dff/0xc22000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 37519360 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.874525070s of 20.991596222s, submitted: 39
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1303724 data_alloc: 234881024 data_used: 9342976
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123199488 unmapped: 26583040 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9cc0000/0x0/0x4ffc00000, data 0x18f0dff/0x19ac000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 26238976 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 26238976 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 26238976 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 26238976 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1373710 data_alloc: 234881024 data_used: 10682368
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 26238976 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123543552 unmapped: 26238976 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f972d000/0x0/0x4ffc00000, data 0x1e83dff/0x1f3f000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1375014 data_alloc: 234881024 data_used: 10694656
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f970f000/0x0/0x4ffc00000, data 0x1ea1dff/0x1f5d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123666432 unmapped: 26116096 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.182755470s of 14.495874405s, submitted: 173
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123781120 unmapped: 26001408 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb23d3f0e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb250e6000 session 0x55cb257163c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1374934 data_alloc: 234881024 data_used: 10694656
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb250e6000 session 0x55cb2397e960
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1084648 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1084648 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1084648 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1084648 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa88c000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 114491392 unmapped: 35291136 heap: 149782528 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1084648 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb237c5c20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb2397e000
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb23c1f0e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb257772c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.957300186s of 26.056312561s, submitted: 38
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb252b0000 session 0x55cb22e63680
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24a95860
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22622400 session 0x55cb24a1ed20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22b01000 session 0x55cb22a68d20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb239183c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 115048448 unmapped: 46800896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 115048448 unmapped: 46800896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 115048448 unmapped: 46800896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa2f1000/0x0/0x4ffc00000, data 0x12bfdff/0x137b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 115048448 unmapped: 46800896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa2f1000/0x0/0x4ffc00000, data 0x12bfdff/0x137b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 115048448 unmapped: 46800896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb24a1a1e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1217690 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 115064832 unmapped: 46784512 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 117948416 unmapped: 43900928 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 36888576 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa2f1000/0x0/0x4ffc00000, data 0x12bfdff/0x137b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 36888576 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 36888576 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1334994 data_alloc: 234881024 data_used: 17313792
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 36888576 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124960768 unmapped: 36888576 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa2f1000/0x0/0x4ffc00000, data 0x12bfdff/0x137b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 36855808 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 124993536 unmapped: 36855808 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fa2f1000/0x0/0x4ffc00000, data 0x12bfdff/0x137b000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x458f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125026304 unmapped: 36823040 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1334994 data_alloc: 234881024 data_used: 17313792
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125034496 unmapped: 36814848 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.868372917s of 16.010391235s, submitted: 40
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 130957312 unmapped: 30892032 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f93b2000/0x0/0x4ffc00000, data 0x1deddff/0x1ea9000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131473408 unmapped: 30375936 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e800 session 0x55cb258243c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e000 session 0x55cb25825e00
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e400 session 0x55cb258252c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816fc00 session 0x55cb258250e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb25824f00
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131596288 unmapped: 30253056 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131678208 unmapped: 30171136 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1516856 data_alloc: 234881024 data_used: 18149376
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131678208 unmapped: 30171136 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e000 session 0x55cb24f374a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131678208 unmapped: 30171136 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8737000/0x0/0x4ffc00000, data 0x2a68dff/0x2b24000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e400 session 0x55cb24f361e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131678208 unmapped: 30171136 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131678208 unmapped: 30171136 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e800 session 0x55cb24f37680
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816f000 session 0x55cb24f37a40
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131702784 unmapped: 30146560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1519196 data_alloc: 234881024 data_used: 18149376
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 131702784 unmapped: 30146560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f8712000/0x0/0x4ffc00000, data 0x2a8ce32/0x2b4a000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 133881856 unmapped: 27967488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140607488 unmapped: 21241856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.067866325s of 12.396329880s, submitted: 138
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f870f000/0x0/0x4ffc00000, data 0x2a8fe32/0x2b4d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1595808 data_alloc: 251658240 data_used: 29470720
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f870f000/0x0/0x4ffc00000, data 0x2a8fe32/0x2b4d000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1595808 data_alloc: 251658240 data_used: 29470720
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140632064 unmapped: 21217280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 140648448 unmapped: 21200896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143187968 unmapped: 18661376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f7d06000/0x0/0x4ffc00000, data 0x3498e32/0x3556000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.850776672s of 10.000297546s, submitted: 56
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143392768 unmapped: 18456576 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143532032 unmapped: 18317312 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1680754 data_alloc: 251658240 data_used: 29532160
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143532032 unmapped: 18317312 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143532032 unmapped: 18317312 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143532032 unmapped: 18317312 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143532032 unmapped: 18317312 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f7cd8000/0x0/0x4ffc00000, data 0x34c5e32/0x3583000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f7cd8000/0x0/0x4ffc00000, data 0x34c5e32/0x3583000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143597568 unmapped: 18251776 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1679274 data_alloc: 251658240 data_used: 29532160
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143663104 unmapped: 18186240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143663104 unmapped: 18186240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143663104 unmapped: 18186240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f7cd4000/0x0/0x4ffc00000, data 0x34c9e32/0x3587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1679746 data_alloc: 251658240 data_used: 29532160
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f7cd4000/0x0/0x4ffc00000, data 0x34c9e32/0x3587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143671296 unmapped: 18178048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 1679746 data_alloc: 251658240 data_used: 29532160
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f7cd4000/0x0/0x4ffc00000, data 0x34c9e32/0x3587000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.070764542s of 17.136646271s, submitted: 20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143728640 unmapped: 18120704 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 143728640 unmapped: 18120704 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb257c94a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e000 session 0x55cb25678960
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 136912896 unmapped: 24936448 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e400 session 0x55cb22a51c20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 136937472 unmapped: 24911872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f92e4000/0x0/0x4ffc00000, data 0x1ebbdff/0x1f77000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 136937472 unmapped: 24911872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1445716 data_alloc: 234881024 data_used: 18149376
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 136937472 unmapped: 24911872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 136937472 unmapped: 24911872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f92e4000/0x0/0x4ffc00000, data 0x1ebbdff/0x1f77000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22b01000 session 0x55cb252a25a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24a1be00
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 136953856 unmapped: 24895488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24992960
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114672 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb09a000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb09a000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114672 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb09a000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114672 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb09a000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114672 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb09a000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1114672 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fb09a000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123928576 unmapped: 37920768 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22b01000 session 0x55cb24b1c960
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb24c6a1e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 123912192 unmapped: 37937152 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e000 session 0x55cb24c6ba40
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e400 session 0x55cb24a941e0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 33.266139984s of 33.476127625s, submitted: 90
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24a95c20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22b01000 session 0x55cb257c8d20
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb22a51e00
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e000 session 0x55cb25717a40
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb2816e800 session 0x55cb25717860
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162264 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fab77000/0x0/0x4ffc00000, data 0x628e61/0x6e5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162264 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb222fb800 session 0x55cb24a60000
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fab77000/0x0/0x4ffc00000, data 0x628e61/0x6e5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197892 data_alloc: 218103808 data_used: 5349376
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fab77000/0x0/0x4ffc00000, data 0x628e61/0x6e5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122478592 unmapped: 39370752 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 39362560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 39362560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1197892 data_alloc: 218103808 data_used: 5349376
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 39362560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 122486784 unmapped: 39362560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4fab77000/0x0/0x4ffc00000, data 0x628e61/0x6e5000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x499f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.718759537s of 18.834480286s, submitted: 44
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125886464 unmapped: 35962880 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129695744 unmapped: 32153600 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129949696 unmapped: 31899648 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266056 data_alloc: 218103808 data_used: 6639616
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129949696 unmapped: 31899648 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129949696 unmapped: 31899648 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9254000/0x0/0x4ffc00000, data 0xdabe61/0xe68000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 31891456 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 31891456 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 31891456 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1266056 data_alloc: 218103808 data_used: 6639616
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 31891456 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 31891456 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9254000/0x0/0x4ffc00000, data 0xdabe61/0xe68000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129957888 unmapped: 31891456 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.704682350s of 10.921176910s, submitted: 89
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb22b01000 session 0x55cb24a612c0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 129826816 unmapped: 32022528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb251e9800 session 0x55cb25678960
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125788160 unmapped: 36061184 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125788160 unmapped: 36061184 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125788160 unmapped: 36061184 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125788160 unmapped: 36061184 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: do_command 'config diff' '{prefix=config diff}'
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125870080 unmapped: 35979264 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: do_command 'config show' '{prefix=config show}'
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: do_command 'counter dump' '{prefix=counter dump}'
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: do_command 'counter schema' '{prefix=counter schema}'
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125419520 unmapped: 36429824 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125558784 unmapped: 36290560 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: do_command 'log dump' '{prefix=log dump}'
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125566976 unmapped: 36282368 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: do_command 'perf dump' '{prefix=perf dump}'
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: do_command 'perf schema' '{prefix=perf schema}'
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125763584 unmapped: 36085760 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125771776 unmapped: 36077568 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125771776 unmapped: 36077568 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125771776 unmapped: 36077568 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125771776 unmapped: 36077568 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125771776 unmapped: 36077568 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125771776 unmapped: 36077568 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125771776 unmapped: 36077568 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125771776 unmapped: 36077568 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125771776 unmapped: 36077568 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125779968 unmapped: 36069376 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125788160 unmapped: 36061184 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125788160 unmapped: 36061184 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125788160 unmapped: 36061184 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125788160 unmapped: 36061184 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125796352 unmapped: 36052992 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125796352 unmapped: 36052992 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125796352 unmapped: 36052992 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125796352 unmapped: 36052992 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125796352 unmapped: 36052992 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125796352 unmapped: 36052992 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125796352 unmapped: 36052992 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125796352 unmapped: 36052992 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125796352 unmapped: 36052992 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125796352 unmapped: 36052992 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125804544 unmapped: 36044800 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125804544 unmapped: 36044800 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125804544 unmapped: 36044800 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125804544 unmapped: 36044800 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125804544 unmapped: 36044800 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125804544 unmapped: 36044800 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125804544 unmapped: 36044800 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125804544 unmapped: 36044800 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125804544 unmapped: 36044800 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125804544 unmapped: 36044800 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125812736 unmapped: 36036608 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125812736 unmapped: 36036608 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125812736 unmapped: 36036608 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125812736 unmapped: 36036608 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125812736 unmapped: 36036608 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125812736 unmapped: 36036608 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125820928 unmapped: 36028416 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125820928 unmapped: 36028416 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125820928 unmapped: 36028416 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125820928 unmapped: 36028416 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125820928 unmapped: 36028416 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125820928 unmapped: 36028416 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125820928 unmapped: 36028416 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 36020224 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125837312 unmapped: 36012032 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125173760 unmapped: 36675584 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125173760 unmapped: 36675584 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125173760 unmapped: 36675584 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125173760 unmapped: 36675584 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125173760 unmapped: 36675584 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125173760 unmapped: 36675584 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125173760 unmapped: 36675584 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125181952 unmapped: 36667392 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 36659200 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 36659200 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 36659200 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 36659200 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125190144 unmapped: 36659200 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 36651008 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 36651008 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 36651008 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 36634624 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 36634624 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 2400.0 total, 600.0 interval
Cumulative writes: 10K writes, 42K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s
Cumulative WAL: 10K writes, 3043 syncs, 3.52 writes per sync, written: 0.04 GB, 0.02 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 2448 writes, 8982 keys, 2448 commit groups, 1.0 writes per commit group, ingest: 9.37 MB, 0.02 MB/s
Interval WAL: 2448 writes, 1024 syncs, 2.39 writes per sync, written: 0.01 GB, 0.02 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 36626432 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 36626432 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 36626432 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 36626432 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 36626432 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 36626432 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 36626432 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 36626432 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125222912 unmapped: 36626432 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125231104 unmapped: 36618240 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125239296 unmapped: 36610048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125239296 unmapped: 36610048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125239296 unmapped: 36610048 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 36601856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 36601856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 36601856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 36601856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 36601856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 36601856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 36601856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 36601856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 36601856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 36601856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 36601856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125247488 unmapped: 36601856 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125255680 unmapped: 36593664 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125255680 unmapped: 36593664 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125255680 unmapped: 36593664 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125255680 unmapped: 36593664 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125255680 unmapped: 36593664 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125255680 unmapped: 36593664 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125255680 unmapped: 36593664 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125255680 unmapped: 36593664 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125263872 unmapped: 36585472 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125272064 unmapped: 36577280 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125280256 unmapped: 36569088 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125280256 unmapped: 36569088 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125280256 unmapped: 36569088 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125280256 unmapped: 36569088 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125280256 unmapped: 36569088 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125280256 unmapped: 36569088 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125280256 unmapped: 36569088 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124644 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125288448 unmapped: 36560896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125288448 unmapped: 36560896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125288448 unmapped: 36560896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125288448 unmapped: 36560896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125288448 unmapped: 36560896 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9ef9000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 317.378509521s of 317.484130859s, submitted: 36
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125296640 unmapped: 36552704 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [0,1])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125493248 unmapped: 36356096 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125616128 unmapped: 36233216 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125616128 unmapped: 36233216 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125624320 unmapped: 36225024 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125632512 unmapped: 36216832 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125632512 unmapped: 36216832 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125632512 unmapped: 36216832 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125640704 unmapped: 36208640 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125640704 unmapped: 36208640 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125640704 unmapped: 36208640 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125640704 unmapped: 36208640 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125640704 unmapped: 36208640 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125648896 unmapped: 36200448 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125648896 unmapped: 36200448 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125657088 unmapped: 36192256 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125657088 unmapped: 36192256 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125657088 unmapped: 36192256 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125657088 unmapped: 36192256 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125657088 unmapped: 36192256 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125657088 unmapped: 36192256 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125657088 unmapped: 36192256 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125657088 unmapped: 36192256 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125657088 unmapped: 36192256 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 36184064 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 36184064 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 36184064 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 36184064 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125673472 unmapped: 36175872 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125681664 unmapped: 36167680 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125689856 unmapped: 36159488 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125698048 unmapped: 36151296 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125706240 unmapped: 36143104 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125714432 unmapped: 36134912 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 ms_handle_reset con 0x55cb23c79400 session 0x55cb24c585a0
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125722624 unmapped: 36126720 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125730816 unmapped: 36118528 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [3])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125739008 unmapped: 36110336 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125747200 unmapped: 36102144 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125755392 unmapped: 36093952 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: bluestore.MempoolThread _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1124352 data_alloc: 218103808 data_used: 57344
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: osd.2 151 heartbeat osd_stat(store_statfs(0x4f9efa000/0x0/0x4ffc00000, data 0x107def/0x1c2000, compress 0x0/0x0/0x0, omap 0x63a, meta 0x5b3f9c6), peers [0,1] op hist [])
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: do_command 'config diff' '{prefix=config diff}'
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: do_command 'config show' '{prefix=config show}'
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: do_command 'counter dump' '{prefix=counter dump}'
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125198336 unmapped: 36651008 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: do_command 'counter schema' '{prefix=counter schema}'
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 36634624 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: prioritycache tune_memory target: 4294967296 mapped: 125132800 unmapped: 36716544 heap: 161849344 old mem: 2845415832 new mem: 2845415832
Oct 10 06:30:48 np0005479823 ceph-osd[77423]: do_command 'log dump' '{prefix=log dump}'
Oct 10 06:30:48 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Oct 10 06:30:48 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2013373529' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 06:30:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:48 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:48 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:30:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:30:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:48 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:30:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:49 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:30:49 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:49 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:49 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:49.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:49 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Oct 10 06:30:49 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1092835265' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Oct 10 06:30:49 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Oct 10 06:30:49 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2000313604' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Oct 10 06:30:49 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Oct 10 06:30:49 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3657722192' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Oct 10 06:30:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:49 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:49 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:50 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:50 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:50 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:50.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:30:50 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Oct 10 06:30:50 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2779081294' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Oct 10 06:30:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:50 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:50 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:51 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Oct 10 06:30:51 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3755513474' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Oct 10 06:30:51 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:51 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:51 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:51.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:51 np0005479823 nova_compute[235775]: 2025-10-10 10:30:51.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:30:51 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Oct 10 06:30:51 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2752456730' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Oct 10 06:30:51 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Oct 10 06:30:51 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/251813461' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Oct 10 06:30:51 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Oct 10 06:30:51 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/821557892' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Oct 10 06:30:51 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Oct 10 06:30:51 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1210927648' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Oct 10 06:30:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:51 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:51 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:52 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:52 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:52 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:52.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:52 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Oct 10 06:30:52 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3064887336' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Oct 10 06:30:52 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Oct 10 06:30:52 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2197365172' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Oct 10 06:30:52 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Oct 10 06:30:52 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2319149536' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Oct 10 06:30:52 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Oct 10 06:30:52 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1413997151' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Oct 10 06:30:52 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Oct 10 06:30:52 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2285275171' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Oct 10 06:30:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:52 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:52 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:53 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Oct 10 06:30:53 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1814715029' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Oct 10 06:30:53 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Oct 10 06:30:53 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3898864091' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Oct 10 06:30:53 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Oct 10 06:30:53 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2844202342' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Oct 10 06:30:53 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:53 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:53 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:53.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:53 np0005479823 nova_compute[235775]: 2025-10-10 10:30:53.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:30:53 np0005479823 systemd[1]: Starting Hostname Service...
Oct 10 06:30:53 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Oct 10 06:30:53 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/112314830' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Oct 10 06:30:53 np0005479823 systemd[1]: Started Hostname Service.
Oct 10 06:30:53 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Oct 10 06:30:53 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4280476709' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Oct 10 06:30:53 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Oct 10 06:30:53 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/853849376' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Oct 10 06:30:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:53 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:53 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Oct 10 06:30:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Oct 10 06:30:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:53 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Oct 10 06:30:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-nfs-cephfs-1-0-compute-2-boccfy[242006]: 10/10/2025 10:30:54 : epoch 68e8dc95 : compute-2 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Oct 10 06:30:54 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:54 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:54 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:54.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:54 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Oct 10 06:30:54 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2382901546' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Oct 10 06:30:54 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Oct 10 06:30:54 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/2698340361' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Oct 10 06:30:54 np0005479823 podman[262322]: 2025-10-10 10:30:54.81071001 +0000 UTC m=+0.080463380 container health_status 3561bc530be6f2b8e438ff446c780ec26ad4170655588b7fc013d4010ddd718c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct 10 06:30:54 np0005479823 podman[262324]: 2025-10-10 10:30:54.825709769 +0000 UTC m=+0.091725319 container health_status e4e4615a2f6559de2bc0f7e12ea95470b2fd0b7d60550bc861ae32a2a32dd1d0 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid)
Oct 10 06:30:54 np0005479823 podman[262323]: 2025-10-10 10:30:54.868677622 +0000 UTC m=+0.137401139 container health_status 470f013157f7d63fff6eea26a8c5b100401702469b7b0d8b452eeb1cf92efdd3 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 10 06:30:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:54 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:54 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:55 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:55 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:30:55 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:55.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:30:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Oct 10 06:30:55 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Oct 10 06:30:55 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3888181591' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Oct 10 06:30:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:55 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:55 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:56 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Oct 10 06:30:56 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2207188954' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Oct 10 06:30:56 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:56 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.000000000s ======
Oct 10 06:30:56 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:56.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Oct 10 06:30:56 np0005479823 nova_compute[235775]: 2025-10-10 10:30:56.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:30:56 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Oct 10 06:30:56 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/988730199' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Oct 10 06:30:56 np0005479823 ceph-mon[74913]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Oct 10 06:30:56 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1923095157' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Oct 10 06:30:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:56 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:56 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:57 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 06:30:57 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Oct 10 06:30:57 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:57 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:30:57 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.102 - anonymous [10/Oct/2025:10:30:57.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:30:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-rgw-default-compute-2-bbeizy[86419]: Fri Oct 10 10:30:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:57 np0005479823 ceph-21f084a3-af34-5230-afe4-ea5cd24a55f4-keepalived-nfs-cephfs-compute-2-fcbgvm[85839]: Fri Oct 10 10:30:57 2025: (VI_0) received an invalid passwd!
Oct 10 06:30:58 np0005479823 radosgw[83867]: ====== starting new request req=0x7f3d7688c5d0 =====
Oct 10 06:30:58 np0005479823 radosgw[83867]: ====== req done req=0x7f3d7688c5d0 op status=0 http_status=200 latency=0.001000032s ======
Oct 10 06:30:58 np0005479823 radosgw[83867]: beast: 0x7f3d7688c5d0: 192.168.122.100 - anonymous [10/Oct/2025:10:30:58.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Oct 10 06:30:58 np0005479823 nova_compute[235775]: 2025-10-10 10:30:58.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct 10 06:30:58 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Oct 10 06:30:58 np0005479823 ceph-mon[74913]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
